Relationship between partial correlations and beta weights

Metaphor (new member, joined Aug 17, 2007; 3 messages):
Hi,

Are beta weights in a multiple regression monotonic transformations of partial correlations? If not, why not?

If so, shouldn't the correlation of beta weights and partial correlations be 1.0?
 
The algebraic signs will be the same. Remember that the correlation coefficient measures linear relationships, not more general monotonic ones. Only two quantities that are exactly linearly related will have a correlation coefficient equal to unity in magnitude.
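A quick numerical sketch of that point (synthetic data and NumPy, not from the thread): a monotone but nonlinear transform of x has a Pearson correlation with x that is strictly below 1, while an exact linear transform correlates at 1.

```python
# Pearson correlation measures *linear* association, so a monotone but
# nonlinear transform of x does not correlate with x at exactly 1.0.
import numpy as np

x = np.linspace(1.0, 5.0, 50)
linear = 2.0 * x + 1.0   # exact linear transform of x
monotone = x ** 3        # monotone but nonlinear transform of x

r_linear = np.corrcoef(x, linear)[0, 1]
r_monotone = np.corrcoef(x, monotone)[0, 1]

print(round(r_linear, 6))   # 1.0
print(round(r_monotone, 6)) # positive, but strictly below 1
```

Both transforms preserve the ordering of x, yet only the linear one yields a correlation of exactly one.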

To see the bigger picture, I suggest you take a look at the following article. Among other things, it shows the precise relationship between the "betas" and partial correlation.
 
Can I ask where in that article beta weights are discussed as functions of partial correlations?

Can I also make clear that I am talking about standardised regression coefficients? What else but the partial correlation would determine a standardised regression coefficient?
 
Standardized regression coefficients, so-called, are the "unstandardized" coefficients multiplied by the standard deviation of the predictor variable and divided by the standard deviation of the response variable. In the simplest case of one predictor, the standardized coefficient is the same as the correlation. In the more general cases, the partial correlations enter into it, of course. A partial correlation can be computed from the squared multiple correlations of two regressions: one containing all the predictors, and one omitting the predictor of interest while retaining the variables held constant. Specifically, the squared partial correlation is (R²_full − R²_reduced)/(1 − R²_reduced).
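To illustrate the one-predictor case just described, here is a minimal sketch with made-up data (NumPy, not from the thread): the unstandardized slope times s_x/s_y is exactly the Pearson correlation.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 3.0 * x + rng.normal(size=200)

# unstandardized OLS slope: cov(x, y) / var(x)
b = np.cov(x, y)[0, 1] / np.var(x, ddof=1)

# standardize: multiply by sd(predictor), divide by sd(response)
beta = b * x.std(ddof=1) / y.std(ddof=1)

r = np.corrcoef(x, y)[0, 1]
# beta equals r up to floating-point error
```

Algebraically, b·s_x/s_y = cov(x, y)/(s_x·s_y), which is the definition of the correlation, so the two numbers agree to machine precision.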

The "beta weights", or standardized coefficients, do provide a "scale free" interpretation, but the multiple correlation needs to be considered as well, since that is the correlation between the predicted values and the response.

Each beta weight (or indeed each regression coefficient) is proportional to a partial correlation, but the constant of proportionality differs from one predictor to another; therefore the correlation between the beta weights and the partial correlations will not, in general, be unity.
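The predictor-specific proportionality can be checked numerically. A synthetic sketch (NumPy, made-up data, not from the thread) with three predictors: each beta shares its sign with the corresponding partial correlation, but the ratio beta_j / partial_j differs across predictors, so one set is not a single rescaling of the other.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
X = rng.normal(size=(n, 3))
X[:, 1] += 0.6 * X[:, 0]  # make two predictors correlated
y = 1.5 * X[:, 0] - 1.0 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(size=n)

def fit_r2(Xm, yv):
    """OLS fit with intercept; return (coefficients, R-squared)."""
    A = np.column_stack([np.ones(len(yv)), Xm])
    b, *_ = np.linalg.lstsq(A, yv, rcond=None)
    resid = yv - A @ b
    tss = ((yv - yv.mean()) ** 2).sum()
    return b, 1.0 - (resid ** 2).sum() / tss

b_full, R2_full = fit_r2(X, y)

betas, partials = [], []
for j in range(3):
    others = np.delete(X, j, axis=1)
    _, R2_red = fit_r2(others, y)  # reduced model omitting predictor j
    # partial correlation: sign from the coefficient, magnitude from
    # the squared-R-difference formula
    pr = np.sign(b_full[j + 1]) * np.sqrt((R2_full - R2_red) / (1.0 - R2_red))
    beta = b_full[j + 1] * X[:, j].std(ddof=1) / y.std(ddof=1)
    betas.append(beta)
    partials.append(pr)

ratios = [b / p for b, p in zip(betas, partials)]
# ratios are all positive yet unequal: each beta is its partial
# correlation times a predictor-specific constant
```

Because the ratios are all positive, the signs of the betas and partial correlations always agree, which is the other half of the claim above.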
 
Thanks for this, I am going to have to go away and digest it.
 