Correlation vs Covariance

Student project for ECE302

by Blue
----
----
==Correlation and Covariance==
Correlation and covariance are closely related. Correlation is used to identify the relationship between two random variables, X and Y. To determine the dependence of the two random variables, the correlation coefficient, <math> \rho </math>, is calculated as:
<math> \rho (X,Y) = \frac{cov(X,Y)}{ \sqrt{var(X)var(Y)} } </math>
Covariance is defined as: <math>cov(X,Y) = E[(X-E[X])(Y-E[Y])]</math> [1]

Correlation is then defined as: <math>E[XY]</math> [2]. The two are related by <math>cov(X,Y) = E[XY] - E[X]E[Y]</math>.
If X and Y are independent of each other, then they are uncorrelated, meaning cov(X,Y) = 0. However, if X and Y are uncorrelated, that does not mean they are independent of each other. The three extreme values that <math>\rho(X,Y)</math> can take are 1, -1, and 0. A value of 1 means that X and Y are linearly dependent: Y-E[Y] is a positive multiple of X-E[X]. A value of -1 means that X and Y are inversely linearly dependent: Y-E[Y] is a negative multiple of X-E[X]. A value of 0 means that X and Y are uncorrelated. [1]
===Examples===
text
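The following NumPy snippet is a minimal sketch of these definitions (the simulated variables, sample size, and seed are illustrative choices, not from the course notes). It estimates cov(X,Y) and <math>\rho(X,Y)</math> for a linearly related pair, then shows a pair that is uncorrelated yet clearly dependent:

<pre>
import numpy as np

rng = np.random.default_rng(0)
n = 100000

# Linearly related pair: Y = 2X + noise
x = rng.normal(size=n)
y = 2 * x + rng.normal(size=n)

cov_xy = np.mean((x - x.mean()) * (y - y.mean()))  # E[(X-E[X])(Y-E[Y])]
rho = cov_xy / np.sqrt(x.var() * y.var())          # correlation coefficient
print(cov_xy, rho)                                 # roughly 2 and 0.894 = 2/sqrt(5)

# Uncorrelated but NOT independent: X symmetric about 0, Y = X^2
y2 = x ** 2
cov_xy2 = np.mean((x - x.mean()) * (y2 - y2.mean()))
print(cov_xy2)  # roughly 0, even though y2 is a deterministic function of x
</pre>

Since <math>Y = X^2</math> is completely determined by X, the near-zero covariance in the second case shows concretely that uncorrelated does not imply independent.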
==Autocorrelation and Autocovariance==
Correlation and covariance compare two different random variables. Autocorrelation and autocovariance instead compare the values of a single random process at two points in time, <math>n_1</math> and <math>n_2</math>.

Autocorrelation is defined as: <math>E[X_{n_1}X_{n_2}]</math>

Autocovariance is then defined as: <math>E[(X_{n_1}-E[X_{n_1}])(X_{n_2}-E[X_{n_2}])]</math>
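As a small illustration (the moving-average process, the helper functions, and the assumption of wide-sense stationarity below are my own, not from the notes), both quantities can be estimated from a single sample path; for a stationary process they depend only on the lag <math>n_2 - n_1</math>:

<pre>
import numpy as np

rng = np.random.default_rng(1)

# One sample path of a moving-average process: X_n = W_n + W_{n-1}
w = rng.normal(size=100001)
x = w[1:] + w[:-1]

def autocorr(x, lag):
    # Sample estimate of E[X_n X_{n+lag}]
    return np.mean(x * x) if lag == 0 else np.mean(x[:-lag] * x[lag:])

def autocov(x, lag):
    # Sample estimate of E[(X_n - E[X])(X_{n+lag} - E[X])]
    xc = x - x.mean()
    return autocorr(xc, lag)

for lag in (0, 1, 2):
    print(lag, autocorr(x, lag), autocov(x, lag))
# Expected: lag 0 -> ~2, lag 1 -> ~1, lag 2 -> ~0.
# Since E[X] = 0 here, autocovariance matches autocorrelation.
</pre>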
----
==References==
[1]: Ilya Pollak. General Random Variables. 2012. Retrieved from https://engineering.purdue.edu/~ipollak/ece302/SPRING12/notes/19_GeneralRVs-4_Multiple_RVs.pdf

[2]: Ilya Pollak. Random Signals. 2004. Retrieved from https://engineering.purdue.edu/~ipollak/ee438/FALL04/notes/Section2.1.pdf
[[2013_Spring_ECE_302_Boutin|Back to ECE302 Spring 2013, Prof. Boutin]] | [[2013_Spring_ECE_302_Boutin|Back to ECE302 Spring 2013, Prof. Boutin]] |