Key difference - correlation vs. regression
Correlation and regression are two methods used in statistics to study the relationship between variables. The main difference between correlation and regression is that correlation measures the degree of the relationship between two variables, while regression is a method of describing that relationship with an equation. Regression also allows a more accurate prediction of the value the dependent variable would take for a given value of the independent variable.
What is correlation?
In statistics, we say that there is a correlation between two variables when the two variables are related. If the relationship between the variables is linear, we can express the degree of their relationship in terms of a number called the Pearson correlation coefficient, denoted $r$.

$r$ takes a value between -1 and 1. A value of 0 means that the two variables are not correlated. Negative values indicate that the correlation between the variables is negative: that is, as one variable increases, the other variable decreases. Likewise, a positive value for $r$ means that the data are positively correlated (if one variable increases, the other variable increases as well).

A value of $r$ of -1 or 1 gives the strongest possible correlation. When $r = -1$, the variables are completely negatively correlated, and when $r = 1$, they are completely positively correlated. The following figure shows different forms of scatter plots between two variables and the correlation coefficient for each case:
Pearson's Correlation Coefficient for Different Types of Scatterplots
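These limiting cases are easy to check numerically. The following is a minimal sketch, not from the original article, that uses NumPy's corrcoef function; the data values are invented purely for illustration.

```python
# Minimal sketch (illustrative only): limiting cases of Pearson's r.
import numpy as np

x = np.arange(1, 11, dtype=float)            # 1, 2, ..., 10

r_pos = np.corrcoef(x, 2 * x + 3)[0, 1]      # perfect positive linear relation
r_neg = np.corrcoef(x, -0.5 * x + 7)[0, 1]   # perfect negative linear relation

rng = np.random.default_rng(0)
r_none = np.corrcoef(x, rng.normal(size=x.size))[0, 1]  # unrelated noise

print(round(r_pos, 4))    # 1.0
print(round(r_neg, 4))    # -1.0
print(round(r_none, 4))   # typically close to 0; exact value depends on the noise
```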
Pearson's correlation coefficient for two variables $x$ and $y$ is defined as follows:

$$r = \frac{\operatorname{cov}(x, y)}{s_x s_y}$$

Here, $\operatorname{cov}(x, y)$ is the covariance between $x$ and $y$:

$$\operatorname{cov}(x, y) = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{n - 1} = \frac{\sum_{i=1}^{n} x_i y_i - n\bar{x}\bar{y}}{n - 1}$$

The terms $s_x$ and $s_y$ stand for the standard deviations of $x$ and $y$, which are defined as:

$$s_x = \sqrt{\frac{\sum_{i=1}^{n}(x_i - \bar{x})^2}{n - 1}} \qquad \text{and} \qquad s_y = \sqrt{\frac{\sum_{i=1}^{n}(y_i - \bar{y})^2}{n - 1}}$$
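As a sketch of how these definitions translate into code, here is a small plain-Python version. The function names are this example's own (not from the article), and the sample ($n - 1$) forms above are assumed.

```python
# Sketch of the definitions above in plain Python (sample, n - 1, forms).
from math import sqrt

def mean(values):
    return sum(values) / len(values)

def covariance(x, y):
    x_bar, y_bar = mean(x), mean(y)
    return sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / (len(x) - 1)

def std_dev(values):
    v_bar = mean(values)
    return sqrt(sum((v - v_bar) ** 2 for v in values) / (len(values) - 1))

def pearson_r(x, y):
    return covariance(x, y) / (std_dev(x) * std_dev(y))
```

For example, `pearson_r([1, 2, 3, 4], [2, 4, 6, 8])` returns 1.0, since the two lists are perfectly linearly related.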
Let's look at an example of how the correlation coefficient is calculated. We will try to calculate the correlation coefficient for the following set of 20 pairs of values of $x$ and $y$:

x | y |
-0.9557 | 0.5369 |
-1.6441 | -0.1560 |
1.2254 | 1.9230 |
1.9062 | 1.9957 |
1.9679 | 2.1673 |
-0.3469 | 0.7954 |
-0.2328 | 0.5415 |
1.5064 | 1.2335 |
0.4278 | 0.7754 |
-0.6359 | 0.3534 |
0.0061 | 0.7565 |
0.8407 | 1.5326 |
0.2713 | 1.3354 |
0.4664 | 1.9980 |
-0.1813 | 1.2539 |
1.4384 | 2.0383 |
1.9001 | 2.7755 |
0.1022 | 0.7861 |
0.1251 | 0.7456 |
-0.6314 | 0.9942 |
The values of $y$ are plotted against the values of $x$ in the graph shown below:

When we look at the equations needed to calculate the correlation coefficient, we see that we first need $\bar{x}$ and $\bar{y}$, the mean values of $x$ and $y$. We find that:

$$\bar{x} = \frac{\sum_{i=1}^{20} x_i}{20} \approx 0.3778 \qquad \text{and} \qquad \bar{y} = \frac{\sum_{i=1}^{20} y_i}{20} \approx 1.2191$$

Next we calculate $x_i y_i$, $(x_i - \bar{x})^2$ and $(y_i - \bar{y})^2$. We will put these values next to the values of $x$ and $y$ from the table above:
x | y | x·y | (x − x̄)² | (y − ȳ)² |
-0.9557 | 0.5369 | -0.5131 | 1.7782 | 0.4654 |
-1.6441 | -0.1560 | 0.2565 | 4.0881 | 1.8909 |
1.2254 | 1.9230 | 2.3564 | 0.7184 | 0.4955 |
1.9062 | 1.9957 | 3.8042 | 2.3360 | 0.6031 |
1.9679 | 2.1673 | 4.2650 | 2.5284 | 0.8991 |
-0.3469 | 0.7954 | -0.2759 | 0.5252 | 0.1795 |
-0.2328 | 0.5415 | -0.1261 | 0.3728 | 0.4592 |
1.5064 | 1.2335 | 1.8581 | 1.2737 | 0.0002 |
0.4278 | 0.7754 | 0.3317 | 0.0025 | 0.1969 |
-0.6359 | 0.3534 | -0.2247 | 1.0276 | 0.7495 |
0.0061 | 0.7565 | 0.0046 | 0.1382 | 0.2140 |
0.8407 | 1.5326 | 1.2885 | 0.2143 | 0.0983 |
0.2713 | 1.3354 | 0.3623 | 0.0113 | 0.0135 |
0.4664 | 1.9980 | 0.9319 | 0.0079 | 0.6067 |
-0.1813 | 1.2539 | -0.2273 | 0.3126 | 0.0012 |
1.4384 | 2.0383 | 2.9319 | 1.1249 | 0.6711 |
1.9001 | 2.7755 | 5.2737 | 2.3174 | 2.4223 |
0.1022 | 0.7861 | 0.0803 | 0.0760 | 0.1875 |
0.1251 | 0.7456 | 0.0933 | 0.0639 | 0.2242 |
-0.6314 | 0.9942 | -0.6277 | 1.0185 | 0.0506 |
With these values we can calculate the covariance. Summing the last three columns of the table gives $\sum x_i y_i \approx 21.8436$, $\sum (x_i - \bar{x})^2 \approx 19.9359$ and $\sum (y_i - \bar{y})^2 \approx 10.4287$, so:

$$\operatorname{cov}(x, y) = \frac{\sum x_i y_i - n\bar{x}\bar{y}}{n - 1} = \frac{21.8436 - 20 \times 0.3778 \times 1.2191}{19} \approx 0.665$$

We can also calculate the standard deviations:

$$s_x = \sqrt{\frac{\sum (x_i - \bar{x})^2}{n - 1}} = \sqrt{\frac{19.9359}{19}} \approx 1.024 \qquad \text{and} \qquad s_y = \sqrt{\frac{\sum (y_i - \bar{y})^2}{n - 1}} = \sqrt{\frac{10.4287}{19}} \approx 0.741$$

Now we can calculate the correlation coefficient:

$$r = \frac{\operatorname{cov}(x, y)}{s_x s_y} = \frac{0.665}{1.024 \times 0.741} \approx 0.876$$
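For readers who want to check the arithmetic, here is a sketch that redoes the worked example with NumPy; using NumPy here is this example's choice, not something taken from the article.

```python
# Sketch: re-checking the worked example above with NumPy.
import numpy as np

x = np.array([-0.9557, -1.6441, 1.2254, 1.9062, 1.9679, -0.3469, -0.2328,
              1.5064, 0.4278, -0.6359, 0.0061, 0.8407, 0.2713, 0.4664,
              -0.1813, 1.4384, 1.9001, 0.1022, 0.1251, -0.6314])
y = np.array([0.5369, -0.1560, 1.9230, 1.9957, 2.1673, 0.7954, 0.5415,
              1.2335, 0.7754, 0.3534, 0.7565, 1.5326, 1.3354, 1.9980,
              1.2539, 2.0383, 2.7755, 0.7861, 0.7456, 0.9942])

print(x.mean(), y.mean())              # ~0.3778 and ~1.2191
cov = np.cov(x, y, ddof=1)[0, 1]       # sample covariance, ~0.665
sx, sy = x.std(ddof=1), y.std(ddof=1)  # sample standard deviations, ~1.024 and ~0.741
print(cov / (sx * sy))                 # ~0.876, same as np.corrcoef(x, y)[0, 1]
```

Running it reproduces the hand-calculated values up to rounding.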
What is regression?
Regression is a method of finding the relationship between two variables. Specifically, we consider linear regression, which provides an equation for a “best-fit line” for a given sample of data in which two variables have a linear relationship. A straight line can be described by an equation of the form

$$y = a + bx$$

where $b$ is the slope of the line and $a$ is the intercept on the $y$-axis. Linear regression allows us to calculate the values of $a$ and $b$. After we have calculated the correlation coefficient $r$, we can calculate these values as:

$$b = r\,\frac{s_y}{s_x} \qquad \text{and} \qquad a = \bar{y} - b\bar{x}$$
Note that here $y$ is assumed to be the dependent variable while $x$ is the independent variable. From our calculations so far we know that $r \approx 0.876$, $s_x \approx 1.024$, $s_y \approx 0.741$, $\bar{x} \approx 0.378$ and $\bar{y} \approx 1.219$. Because of this,

$$b = 0.876 \times \frac{0.741}{1.024} \approx 0.634 \qquad \text{and} \qquad a = 1.219 - 0.634 \times 0.378 \approx 0.98$$
The image below shows the previous scatter plot with the line $y = 0.98 + 0.634x$:
As mentioned earlier, regression analysis helps us make predictions. For example, if the value of the independent variable ($x$) were 1.000, then we can predict that $y$ would be close to $0.98 + 0.634 \times 1.000 \approx 1.614$. In reality, the value of $y$ doesn't have to be exactly 1.614; due to this uncertainty, the actual value is likely to differ somewhat. Note that the accuracy of the prediction is higher for data with a correlation coefficient closer to ±1.
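The slope, intercept and prediction can be reproduced in a few lines. The sketch below simply plugs the rounded summary values from the worked example into the formulas given earlier; the variable names are invented for this example.

```python
# Sketch: best-fit line from the summary statistics of the worked example.
# Values are rounded, so results match the text only up to rounding error.
x_bar, y_bar = 0.3778, 1.2191   # means of x and y
s_x, s_y = 1.0243, 0.7409       # sample standard deviations
r = 0.876                       # correlation coefficient

b = r * s_y / s_x               # slope of the best-fit line, ~0.634
a = y_bar - b * x_bar           # intercept, ~0.98
print(b, a)

# Prediction of y for x = 1.000: ~1.61, matching the value of about 1.614 above
print(a + b * 1.000)
```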
Difference Between Correlation and Regression
Describe relationships
Correlation describes the degree to which two variables are related.
Regression provides a method of finding the relationship between two variables.
Make predictions
Correlation simply describes how well two variables are related. Analyzing the correlation between two variables does not improve the accuracy with which the value of the dependent variable could be predicted for a given value of the independent variable.
Regression allows us to more accurately predict values of the dependent variable for a given value of the independent variable.
Dependency between variables
When analyzing correlation, it does not matter which variable is dependent and which is independent.
When analyzing regression, it is necessary to distinguish between the dependent and the independent variable.
Image courtesy:
“Redesign File: Correlation_examples.png with vector graphics (SVG file)” by DenisBoigelot (own work, original uploader was Imagecreator) [CC0 1.0], via Wikimedia Commons