
# FM Least Squares

## 3.6 Least Squares Regression for Transformed Data

Note: if you cannot remember how to interpret least squares regression lines, revise notes for 3.3 Using the Formula for a Fitted Line.

### Guideline to Analyse Least Squares Linear Regression Relationships for Transformed Data

• Analysing a least squares linear fit for transformed data is similar to the process for non-transformed data; however, keep in mind that the association is not between the explanatory and response variables themselves, but between the transformed variable and the non-transformed variable (which will be either the explanatory or the response variable).
• When interpreting the coefficient of determination, it indicates what percentage of variation in the transformed variable is explained by variation in the non-transformed variable, or vice versa (e.g. for a squared transformation of the explanatory variable, the coefficient indicates what percentage of variation in y can be explained by variation in x^2).
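The steps above can be sketched in Python. This is a minimal illustration using numpy with a small hypothetical dataset and an x^2 transformation; the variable names and values are assumptions, not data from the notes.

```python
import numpy as np

# Hypothetical dataset: y grows roughly with the square of x.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 9.8, 16.5, 24.9])

# Apply the x^2 transformation, then fit y against the transformed variable.
x2 = x ** 2
b, a = np.polyfit(x2, y, 1)  # slope, intercept for y = a + b*x^2

# Coefficient of determination between y and the TRANSFORMED variable x^2.
r = np.corrcoef(x2, y)[0, 1]
r_squared = r ** 2

# Interpreted as a percentage: variation in y explained by variation in x^2.
print(f"r^2 = {r_squared:.3f} ({r_squared * 100:.1f}% of variation explained)")
```

The key point the code mirrors: the correlation (and hence r^2) is computed between y and x^2, not between y and x.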

## 3.4 Further Measures for Association Strength

### Coefficient of Determination; r^2

• The coefficient of determination gives a quantitative way of determining how much of the variation of the response variable is explained by variation in the explanatory variable.
• It is represented by a lower-case r with a 2 superscript and can be calculated by squaring the correlation coefficient:

r^{2}=\left(\frac{\sum_{i=1}^{n}\left(x_{i}-\bar{x}\right)\left(y_{i}-\bar{y}\right)}{(n-1) s_{x} s_{y}}\right)^{2}

• Calculating the coefficient of determination gives a decimal answer; however, when interpreting the value, you should convert it to a percentage (multiply by 100).
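The formula above can be checked numerically. This is a small sketch using numpy and hypothetical data; it computes r from the definitional formula with sample standard deviations, then squares it.

```python
import numpy as np

# Hypothetical paired data: explanatory variable x, response variable y.
x = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
y = np.array([3.1, 5.2, 6.8, 9.1, 10.9])

n = len(x)
sx = x.std(ddof=1)  # sample standard deviations (n - 1 divisor)
sy = y.std(ddof=1)

# Correlation coefficient from the formula, then square it.
r = np.sum((x - x.mean()) * (y - y.mean())) / ((n - 1) * sx * sy)
r_squared = r ** 2

# Report as a percentage when interpreting.
print(f"r^2 = {r_squared:.4f} ({r_squared * 100:.1f}%)")
```

Squaring always gives a value between 0 and 1, so the percentage is between 0% and 100%.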

## 3.3 Using the Formula for a Fitted Line

### Interpolation

• After fitting a model to a dataset (through linear regression), we can use that model to estimate values we don’t have data points for.
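As a minimal sketch of using a fitted line for estimation: the intercept and slope below are hypothetical values standing in for the output of a least squares fit.

```python
# Interpolation with a fitted line y = a + b*x (hypothetical coefficients).
a, b = 1.5, 0.8  # intercept and slope from a least squares fit

def predict(x):
    """Estimate the response for an explanatory value x."""
    return a + b * x

# Interpolating: x = 4.0 is assumed to lie within the range of the data.
estimate = predict(4.0)
print(estimate)  # 1.5 + 0.8 * 4.0 = 4.7
```

Estimates made within the range of the original data are interpolation; estimates made outside that range are extrapolation and are less reliable.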

## 3.2 Modelling Linear Associations

### Identifying Explanatory and Response Variables

• It is important to correctly identify the explanatory and response variables when using regression; swapping them produces a different (and incorrect) fitted line.
• The explanatory variable is the variable which is used to explain or predict the response variable.
• In a conventional x-y dataset, the x variable is the explanatory variable and y is the response variable.

### Fitting Least Squares Models

• Start by identifying the explanatory and response variables.
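Once the explanatory and response variables are identified, a fit can be sketched as follows. This example uses numpy's `polyfit` on hypothetical data; the numbers are illustrative only.

```python
import numpy as np

# Hypothetical data: x is the explanatory variable, y the response.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.9, 8.2, 9.8])

# np.polyfit returns coefficients highest power first: [b, a] for y = a + b*x.
b, a = np.polyfit(x, y, 1)
print(f"fitted line: y = {a:.2f} + {b:.2f}x")
```

Note the argument order: the explanatory variable comes first, the response second; reversing them would fit the wrong relationship.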

## 3.1 Least Squares Linear Regression

### The Idea behind Least Squares Regression

• In order to conveniently estimate the expected values of one variable based on another, we often create a mathematical model which fits, as closely as possible, the data we have collected. In Further Maths, we will only deal with linear regression, where we try to come up with a straight line that fits our data.
• In least squares regression, we try to find that “best fit” by finding a line that minimises the value of the sum of squared residuals (i.e. we take the difference between each datapoint and the line, then square each and add them all together).
• The resulting line is of the form

y=a+bx

where y and x are the response and explanatory variables, respectively, and a and b are constants which must be determined.

• Least squares linear regression is only appropriate if the data is numerical, the association is linear, and there are no clear outliers.