Technology & Engineering
Nonlinear Regression
Nonlinear regression is a statistical method used to model complex relationships between variables that cannot be adequately represented by a linear model. It allows curves and other nonlinear patterns to be fitted to data, making it a valuable tool in engineering and technology for analyzing and predicting nonlinear phenomena.
Written by Perlego with AI-assistance
9 Key excerpts on "Nonlinear Regression"
- eBook - PDF
- Eugene Demidenko(Author)
- 2019(Publication Date)
- Wiley(Publisher)
Chapter 9 Nonlinear Regression

Nonlinear Regression is a powerful statistical tool rarely covered in traditional statistics textbooks. Nonlinear Regression is not only a practically important technique, but also an important example of a real-life statistical model where the classical theory of unbiased estimation and sufficient statistics does not work. Undoubtedly, the linear model is the champion among statistical techniques when it comes to modeling relationships between variables. However, sometimes the association is not linear, such as when the response has a sigmoid shape; then Nonlinear Regression must be applied. Unlike linear regression, nonlinear regression is a complex statistical model whose small-sample properties are difficult to study; here we rely on asymptotic properties. The major method of estimation is nonlinear least squares, which, unlike linear least squares, requires iterations. Various numerical issues arise in the Nonlinear Regression model: (1) finding a satisfactory starting value, (2) existence of a solution of the nonlinear least squares problem, and (3) multiple local minima of the residual sum of squares. This chapter covers the major concepts of Nonlinear Regression, illustrated with various examples, and its implementation in R. The chapter ends with the design of experiments emerging in engineering or the chemical sciences, where the values of the independent variable may be chosen by the experimentalist.

9.1 Definition and motivating examples

Nonlinear Regression is an obvious generalization of linear regression (the linear model). The main difference is that, in Nonlinear Regression, the expected value of the dependent variable is a nonlinear function of the parameters. For example, quadratic regression is y = β₀ + β₁x + β₂x² + ε.
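The numerical issues listed in the excerpt (starting values, existence of a solution, multiple local minima) show up in any nonlinear least squares routine. The book implements its examples in R; purely as an illustrative stand-in, here is a minimal Python sketch using scipy.optimize.curve_fit to fit a sigmoid-shaped mean function, with starting values supplied explicitly and several starts compared to guard against local minima. The specific model form and the simulated data are assumptions of this sketch, not taken from the book.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical sigmoid mean function: E(y) = a / (1 + exp(-b * (x - c)))
def sigmoid(x, a, b, c):
    return a / (1.0 + np.exp(-b * (x - c)))

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = sigmoid(x, 3.0, 1.2, 5.0) + rng.normal(scale=0.1, size=x.size)  # simulated data

# Nonlinear least squares is iterative: each call needs starting values (p0).
# Trying several starting points helps detect multiple local minima of the RSS.
best = None
for p0 in [(1.0, 0.5, 2.0), (2.0, 1.0, 5.0), (5.0, 2.0, 8.0)]:
    try:
        popt, pcov = curve_fit(sigmoid, x, y, p0=p0)
        rss = np.sum((y - sigmoid(x, *popt)) ** 2)  # residual sum of squares
        if best is None or rss < best[1]:
            best = (popt, rss)
    except RuntimeError:
        pass  # the iteration may fail to converge from a poor starting value

print("estimates:", best[0], "RSS:", best[1])
```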
- eBook - ePub
- Douglas C. Montgomery, Elizabeth A. Peck, G. Geoffrey Vining(Authors)
- 2021(Publication Date)
- Wiley(Publisher)
CHAPTER 12 INTRODUCTION TO Nonlinear Regression

Linear regression models provide a rich and flexible framework that suits the needs of many analysts. However, linear regression models are not appropriate for all situations. There are many problems in engineering and the sciences where the response variable and the predictor variables are related through a known nonlinear function. This leads to a Nonlinear Regression model. When the method of least squares is applied to such models, the resulting normal equations are nonlinear and, in general, they can be difficult to solve. The usual approach is to directly minimize the residual sum of squares by an iterative procedure. In this chapter we describe how to estimate the parameters in a Nonlinear Regression model and show how to make appropriate inferences about the model parameters. We also illustrate computer software for Nonlinear Regression.

12.1 LINEAR AND Nonlinear Regression MODELS
12.1.1 Linear Regression Models
In previous chapters we have concentrated on the linear regression model

y = β₀ + β₁x₁ + β₂x₂ + ⋯ + βₖxₖ + ε     (12.1)

These models include not only first-order relationships, such as Eq. (12.1), but also polynomial models and other more complex relationships. In fact, we could write the linear regression model as

y = β₀ + β₁z₁ + β₂z₂ + ⋯ + βₖzₖ + ε     (12.2)

where zi represents any function of the original regressors x1, x2, …, xk, including transformations such as exp(xi), …, and sin(xi). These models are called linear regression models because they are linear in the unknown parameters, the βj, j = 1, 2, …, k. We may write the linear regression model (12.1) in a general form as

y = x′β + ε     (12.3)

where x′ = [1, x1, x2, …, xk]. Since the expected value of the model errors is zero, the expected value of the response variable is E(y) = f(x, β) = x′β. We usually refer to f(x, β) as the expectation function.
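To make concrete why a model such as (12.2) is still a linear regression model, the sketch below (not from the book) builds a design matrix whose columns are transformations of a single regressor, here exp(x) and sin(x) as in the excerpt, and solves the linear least squares problem in closed form. The simulated data and coefficient values are assumptions of the example.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.1, 3.0, 40)
y = 1.0 + 2.0 * np.exp(x) - 0.5 * np.sin(x) + rng.normal(scale=0.2, size=x.size)

# The regressors z1 = exp(x) and z2 = sin(x) are nonlinear functions of x,
# but the model y = b0 + b1*z1 + b2*z2 + e is linear in the parameters,
# so ordinary (closed-form) least squares applies.
Z = np.column_stack([np.ones_like(x), np.exp(x), np.sin(x)])
beta_hat, *_ = np.linalg.lstsq(Z, y, rcond=None)
print("beta_hat:", beta_hat)  # estimates of [b0, b1, b2]
```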
- Claus Thorn Ekstrom, Helle Sørensen(Authors)
- 2014(Publication Date)
- Chapman and Hall/CRC(Publisher)
Chapter 9 Non-linear regression

Until now the focus has been on linear models, where the mean of the outcome (or possibly a transformation of the response) is a linear function of the unknown parameters of the statistical model. In this chapter we consider the more general class of non-linear regression models, where some of the parameters in the model do not have a linear relationship with the mean response. Yet the aims of the analysis are the same: estimation of the parameters of the non-linear functional relationship and quantification of the uncertainty of the parameters, so that it is possible to create confidence intervals and test hypotheses. The most important advantage of non-linear regression models, compared to linear models, is that our class of models to describe the mean response is much larger. In fact, linear models are a special case of non-linear regression models. Secondly, non-linear regression models allow us to create models that are biologically more plausible in the sense that they more directly describe the underlying data-generating system (e.g., the underlying biological system) that we try to model. This improves the understanding of the model and implies that the parameters in the model often have very specific interpretations. The price we pay for having a more flexible model is in the model fitting procedure, which, unlike for linear models, often requires that we have some good guesses about the true values of the parameters in the model to ensure that we obtain the best fitting model. Another price is that there are only approximate (not exact) and asymptotic results about the distribution of parameter estimates and test statistics. Fortunately, when the sample is not small, this is more a theoretical than a practical issue. Non-linear regression is frequently used, for example, for growth data (i.e., time-size relationships), concentration-response relationships in chemistry and toxicology, bioassay modeling, and economics.
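As a rough companion to these points (not taken from the book), the following Python sketch fits a hypothetical logistic growth curve to simulated time-size data, shows how starting guesses can be read roughly off the data, and forms approximate asymptotic confidence intervals from the covariance matrix returned by the fitting routine. The model form and data are assumptions of the example.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic_growth(t, K, r, t0):
    """Hypothetical growth model: size approaches K at rate r, midpoint t0."""
    return K / (1.0 + np.exp(-r * (t - t0)))

rng = np.random.default_rng(2)
t = np.linspace(0, 20, 30)
size = logistic_growth(t, 10.0, 0.6, 8.0) + rng.normal(scale=0.3, size=t.size)

# Good starting guesses matter: here they are read roughly off the data.
p0 = (size.max(), 0.5, np.median(t))
popt, pcov = curve_fit(logistic_growth, t, size, p0=p0)

# Approximate (asymptotic) 95% confidence intervals for the parameters.
se = np.sqrt(np.diag(pcov))
for name, est, s in zip(["K", "r", "t0"], popt, se):
    print(f"{name}: {est:.3f}  95% CI ~ ({est - 1.96*s:.3f}, {est + 1.96*s:.3f})")
```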
- Anders Rasmuson, Bengt Andersson, Louise Olsson, Ronnie Andersson(Authors)
- 2014(Publication Date)
- Cambridge University Press(Publisher)
7 Statistical analysis of mathematical models

7.1 Introduction

In chemical engineering, mathematical modeling is crucial in order to design equipment, choose proper operating conditions, regulate processes, etc. It is almost always necessary to use experimental data for model development. Figure 7.1(a) shows a data set and a linear fit for these data. From this result, it is easy to see that this line describes the data. However, for the data set shown in Figure 7.1(b), this is not so clear. The solid line represents the linear fit for these data, which is derived from regression analysis. By simply observing the data, it can be seen that either of the dashed lines could be a possible fit. These results clearly show that it is not possible to determine parameters for models only by judging which line looks like a good fit; a detailed statistical analysis is needed. In this chapter, we start by describing linear regression, which is a method for determining parameters in a model. The accuracy of the parameters can be estimated by confidence intervals and regions, which will be discussed in Section 7.5. Correlation between parameters is often a major problem for large mathematical models, and the determination of so-called correlation matrices will be described. In more complex chemical engineering models, non-linear regression is required, and this is also described in this chapter.

7.2 Linear regression

Regression analysis is a statistical method for determining parameters in models. The simplest form is a first-order straight-line model, which can be described by

y = β₀ + β₁x + ε,     (7.1)

where y is the dependent variable, also called the response variable; x is the predictor of y, also known as the regressor or independent variable; the βi are parameters; and ε is the stochastic part, which describes the random error. In this chapter, vectors and matrices will be written in bold italic style.
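A minimal numerical sketch of Eq. (7.1), using simulated data that is not from the book: estimate β₀ and β₁ by least squares, then compute the parameter correlation matrix from the estimated covariance of the coefficients, the quantity the excerpt says is often a major concern for larger models.

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0, 5, 25)
y = 2.0 + 1.5 * x + rng.normal(scale=0.4, size=x.size)  # y = b0 + b1*x + noise

X = np.column_stack([np.ones_like(x), x])            # design matrix for Eq. (7.1)
beta_hat, res, *_ = np.linalg.lstsq(X, y, rcond=None)

n, p = X.shape
sigma2 = np.sum((y - X @ beta_hat) ** 2) / (n - p)   # error variance estimate
cov = sigma2 * np.linalg.inv(X.T @ X)                # covariance of [b0, b1]
corr = cov / np.sqrt(np.outer(np.diag(cov), np.diag(cov)))  # correlation matrix

print("estimates:", beta_hat)
print("parameter correlation matrix:\n", corr)
```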
- eBook - PDF
Regression Analysis and Linear Models
Concepts, Applications, and Implementation
- Richard B. Darlington, Andrew F. Hayes(Authors)
- 2016(Publication Date)
- The Guilford Press(Publisher)
12 Nonlinear Relationships

Assuming linearity between two variables when modeling their relationship often results in reasonably good models that are useful and easy to interpret. But sometimes we have reason to believe a relationship is not linear, or the evidence compels us to accept that it is not. In spite of its name, linear regression analysis can be used to model relationships that are better described with curves than with straight lines. In this chapter we discuss reasons you might choose to fit a curve to a relationship rather than a straight line, and we show how to detect nonlinearity visually as well as using polynomial regression. We also give a brief overview of spline regression, an interesting extension of regression analysis that allows for chaining of line or curve segments to capture complex forms of nonlinearity. We end with a discussion of transformations, often used to make nonlinear relationships approximate linear ones.

12.1 Linear Regression Can Model Nonlinear Relationships

Relationships between variables are sometimes better described with curves than with straight lines. A graph showing world population on the vertical axis against time on the horizontal axis would constantly curve upward, with the growth accelerating rapidly with time. Human height against age rises more slowly during childhood than in the early teen years but levels off later. A plot of "commitment to democracy" versus the extent to which a person identifies as politically conservative versus politically liberal might show greater commitment among those in the middle of the ideology continuum than among those on either the liberal or the conservative end of the spectrum. Desire to acquire more money might be especially high among people who have very little, slowly drop off as income increases, and perhaps climb again among people who are already very wealthy.
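To make the polynomial regression idea concrete, here is a small sketch with invented data (not the book's example) that fits linear and quadratic models to a curved relationship by ordinary least squares and compares residual sums of squares; a clearly lower RSS for the quadratic fit is one simple numerical signal of nonlinearity.

```python
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(-2, 2, 60)
y = 1.0 + 0.5 * x + 1.2 * x**2 + rng.normal(scale=0.3, size=x.size)  # curved data

def ols_rss(X, y):
    """Fit by ordinary least squares and return (residual sum of squares, coefficients)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ beta) ** 2), beta

rss_lin, _ = ols_rss(np.column_stack([np.ones_like(x), x]), y)
rss_quad, beta_quad = ols_rss(np.column_stack([np.ones_like(x), x, x**2]), y)

print("RSS linear:   ", rss_lin)
print("RSS quadratic:", rss_quad)     # noticeably smaller when curvature is real
print("quadratic coefficients:", beta_quad)
```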
- Joseph Ofungwu(Author)
- 2014(Publication Date)
- Wiley(Publisher)
However, Nonlinear Regression also has limitations: (1) the form of the nonlinear model has to be specified by the analyst, and although there are several "standard" nonlinear statistical models available (e.g., exponential growth or decay models and power models), it is not always easy to find a suitable model for the data in question. It is possible to find computer programs that can automatically fit several equations or models to the data and then choose the best-fitting equation, but such a model may not be meaningful because the computer obviously has no knowledge of the process being modeled; (2) unlike linear regression, where "closed-form" solutions can always be obtained analytically for the regression coefficients, closed-form solutions are usually not possible with Nonlinear Regression, necessitating approximate solutions using numerical, iterative methods instead.
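The point about automatically comparing candidate models can be sketched as follows (illustrative only, with assumed data and model forms): fit an exponential decay model and a power model by iterative nonlinear least squares and compare their residual sums of squares, keeping in mind the excerpt's caution that the lower-RSS model is not necessarily the more meaningful one.

```python
import numpy as np
from scipy.optimize import curve_fit

def exp_decay(x, a, b):
    return a * np.exp(-b * x)      # "standard" exponential decay model

def power_model(x, a, b):
    return a * np.power(x, b)      # "standard" power model

rng = np.random.default_rng(5)
x = np.linspace(0.5, 10, 40)
y = 5.0 * np.exp(-0.4 * x) + rng.normal(scale=0.1, size=x.size)  # assumed data

for name, f, p0 in [("exponential", exp_decay, (1.0, 0.1)),
                    ("power", power_model, (1.0, -1.0))]:
    popt, _ = curve_fit(f, x, y, p0=p0)
    rss = np.sum((y - f(x, *popt)) ** 2)
    print(f"{name}: parameters {popt}, RSS = {rss:.4f}")
# The smaller RSS identifies the better-fitting curve, but only subject-matter
# knowledge can say whether that model is meaningful for the process studied.
```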
- eBook - PDF
- D.N. Rutledge(Author)
- 1996(Publication Date)
- Elsevier Science(Publisher)
However, they are difficult to manipulate manually, and complicated for some researchers, who sometimes devise complicated means to circumvent their use. Examples of this are the Dixon plot to study enzyme inhibition (Dixon and Webb, 1979), or the separation of sums of exponential terms into components by subtraction in decay experiments or in compartmental analysis (Prince et al., 1980). Not only is this unnecessarily painful, but it is also very dangerous, since it can lead to gross errors in the evaluation of parameter values. Functions can also be linear or nonlinear with respect to their independent variables. All kinds of combinations exist concerning linearity or nonlinearity with respect to the variables and with respect to the parameters. However, only the latter are relevant in regression. When we say that a model is nonlinear in regression, we always refer to the parameters. This concept must be made clear, because it is a common cause of mistakes.

1.2 Aims and uses of regression analysis

Nonlinear Regression can be used in three ways, and these may or may not coincide in a single experimental study (Fig. 2): (1) to test the validity of the model (or to contrast the hypothesis); (2) to characterize the model (in other words, to estimate parameters); and (3) to predict the behaviour of the system (interpolation and calibration).

[Fig. 2: Main purposes of curve fitting or Nonlinear Regression: validate or compare models (hypothesis testing); characterize the model (parameter estimation); predict the behaviour (interpolation/extrapolation and calibration).]

Model validation or comparison is one important application of regression analysis.
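As an illustration of fitting a sum of exponential terms directly rather than separating the components by subtraction (a sketch with invented data, not the chapter's own example), the model below is nonlinear in its rate parameters, so nonlinear least squares is used to estimate all four parameters in one step.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_exponentials(t, a1, k1, a2, k2):
    """Sum of two decay terms; nonlinear in the rate parameters k1 and k2."""
    return a1 * np.exp(-k1 * t) + a2 * np.exp(-k2 * t)

rng = np.random.default_rng(6)
t = np.linspace(0, 12, 60)
y = two_exponentials(t, 4.0, 1.0, 2.0, 0.15) + rng.normal(scale=0.05, size=t.size)

# All four parameters are estimated jointly, avoiding error-prone manual
# "curve peeling" by subtraction of the slow component.
p0 = (3.0, 2.0, 1.0, 0.1)          # rough starting values
popt, pcov = curve_fit(two_exponentials, t, y, p0=p0)
print("estimates (a1, k1, a2, k2):", popt)
```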
- eBook - PDF
Introduction to Multivariate Analysis
Linear and Nonlinear Modeling
- Sadanori Konishi(Author)
- 2014(Publication Date)
- Chapman and Hall/CRC(Publisher)
Chapter 3 Nonlinear Regression Models

In Chapter 2 we discussed the basic concepts of modeling and constructed linear regression models for phenomena with linear structures. We considered, in particular, the series of processes in regression modeling: specification of linear models to ascertain relations between variables, estimation of the parameters of the specified models by least squares or maximum likelihood, and evaluation of the estimated models to select the most appropriate model. For linear regression models having a large number of predictor variables, the usual methods of separating model estimation and evaluation are inefficient for the construction of models with high reliability and prediction. We introduced various regularization methods with an L1 penalty term in addition to the sum of squared errors and log-likelihood functions. We now consider the types of models to assume for the analysis of phenomena containing complex nonlinear structures. In assessing the degree of impact on the human body in a car collision, as a basic example, it is necessary to accumulate and analyze experimental data and from them develop models. The measured and observed data are quite complex and therefore difficult to comprehend, in contrast to the types of data amenable to specific functional forms such as linear and polynomial models. In this chapter, we consider the resolution of this problem by organizing the basic concepts of regression modeling into a more general framework and extending linear models to nonlinear models for the extraction of information from data with complex structures.

3.1 Modeling Phenomena

We first consider the general modeling process for a certain phenomenon. Suppose that the observed n-set of data relating to the predictor variable x and the response variable y is given by {(xi, yi); i = 1, 2, …, n}.
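One common way to extend linear models to data {(xi, yi)} with nonlinear structure is to regress on basis functions of x. The sketch below is illustrative rather than the book's own construction; the Gaussian basis, the simulated data, and the small ridge-type penalty used to stabilize the fit are all assumptions of the example.

```python
import numpy as np

rng = np.random.default_rng(7)
x = np.linspace(0, 1, 80)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=x.size)  # nonlinear structure

# Gaussian basis functions spread over the range of x: the fitted curve is
# nonlinear in x but still linear in the weights, so it extends the linear framework.
centers = np.linspace(0, 1, 10)
width = 0.1
Phi = np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2 * width ** 2))
Phi = np.column_stack([np.ones_like(x), Phi])

# Small ridge penalty to keep the weight estimates stable (an assumption of the sketch).
lam = 1e-3
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)
fitted = Phi @ w
print("max absolute error against the true curve:", np.max(np.abs(fitted - np.sin(2 * np.pi * x))))
```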
- Anthony Hayter(Author)
- 2012(Publication Date)
- Cengage Learning EMEA(Publisher)
CHAPTER THIRTEEN Multiple Linear Regression and Nonlinear Regression

A multiple linear regression model is an extension of a simple linear regression model that allows the response variable y to be modeled as a linear function of more than one input variable xi. The ideas and concepts behind multiple linear regression are similar to those discussed in Chapter 12 for simple linear regression, although the computations are considerably more complex. The best way to deal with multiple linear regression mathematically is with a matrix algebra approach, which is discussed in Section 13.3. Nevertheless, the discussion of multiple linear regression and the examples presented in Sections 13.1, 13.2, and 13.4 can be read without an understanding of Section 13.3. Nonlinear Regression models, discussed in Section 13.5, are models in which the expected value of the response variable y is not expressed as a linear combination of the parameters. Even though many of the ideas in Nonlinear Regression are similar to those in linear regression, the mathematical processes required to fit the models and to make statistical inferences are different.

13.1 Introduction to Multiple Linear Regression

13.1.1 The Multiple Linear Regression Model

Consider the problem of modeling a response variable y as a function of k input variables x₁, …, xₖ based upon a data set consisting of the n sets of values (y₁, x₁₁, x₂₁, …, xₖ₁), …, (yₙ, x₁ₙ, x₂ₙ, …, xₖₙ). Thus, yᵢ is the value taken by the response variable y for the ith observation, which is obtained with values x₁ᵢ, x₂ᵢ, …, xₖᵢ of the k input variables x₁, x₂, …, xₖ. In multiple linear regression, the value of the response variable yᵢ is modeled as

yᵢ = β₀ + β₁x₁ᵢ + ⋯ + βₖxₖᵢ + εᵢ

which consists of a linear combination β₀ + β₁x₁ᵢ + ⋯ + βₖxₖᵢ of the corresponding values of the input variables together with an error term εᵢ.
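A compact numerical sketch of the matrix-algebra approach mentioned above, using simulated data rather than the textbook's example: stack the input variables into a design matrix X with an intercept column and solve the normal equations for the coefficient vector.

```python
import numpy as np

rng = np.random.default_rng(8)
n, k = 50, 3
X_inputs = rng.normal(size=(n, k))                    # k input variables
beta_true = np.array([1.0, 2.0, -1.5, 0.7])           # [b0, b1, b2, b3], assumed values
y = beta_true[0] + X_inputs @ beta_true[1:] + rng.normal(scale=0.3, size=n)

X = np.column_stack([np.ones(n), X_inputs])           # add the intercept column
# Least squares estimate beta_hat solves the normal equations (X'X) beta = X'y,
# computed here without forming an explicit matrix inverse.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print("beta_hat:", beta_hat)
```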
Index pages curate the most relevant extracts from our library of academic textbooks. They’ve been created using an in-house natural language model (NLM), each adding context and meaning to key research topics.