Joint distribution and variance

#1
Hi everyone!

I was wondering if there is a way to calculate the joint distribution of two fully correlated variables, both with known distributions, expected values and variances, without knowing the conditional distribution.

If this is not possible, is there a way of finding \( Var(XY) = E[(XY)^2] - (E[XY])^2 \) given that \( Corr(X,Y) = 1 \)? I can't seem to find an expression for \( E[(XY)^2] \)...

Thanks!

BGM

TS Contributor
#2
When two variables are perfectly correlated, they are linearly dependent (almost surely). See, e.g.

http://en.wikipedia.org/wiki/Cauchy–Schwarz_inequality

Therefore their joint distribution is singular (degenerate): you have \( Y = \alpha + \beta X \) almost surely for some constants \( \alpha, \beta \), and specifically \( \beta > 0 \) when \( Corr[X, Y] = 1 \).
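Since both marginal means and variances are known, matching the first two moments pins the constants down explicitly. A quick sketch, writing \( \mu_X, \mu_Y, \sigma_X, \sigma_Y \) for the known means and standard deviations (my notation, not from the original post): taking expectations and variances of \( Y = \alpha + \beta X \) gives \( \mu_Y = \alpha + \beta \mu_X \) and \( \sigma_Y^2 = \beta^2 \sigma_X^2 \), hence

\[ \beta = \frac{\sigma_Y}{\sigma_X}, \qquad \alpha = \mu_Y - \frac{\sigma_Y}{\sigma_X}\,\mu_X. \]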

As a result, since you know each marginal distribution, \( X \) and \( Y \) must be location-scale transformations of each other. If the two marginals are not related in that way, then there is a mistake somewhere.
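To address the \( E[(XY)^2] \) part of #1: once \( Y = \alpha + \beta X \), the product reduces to a polynomial in \( X \), so everything can be written in terms of the raw moments of \( X \) alone. Note this needs the third and fourth moments of \( X \), not just its mean and variance:

\[ XY = \alpha X + \beta X^2, \]
\[ E[XY] = \alpha\,E[X] + \beta\,E[X^2], \]
\[ E[(XY)^2] = \alpha^2 E[X^2] + 2\alpha\beta\,E[X^3] + \beta^2 E[X^4], \]

and then \( Var(XY) = E[(XY)^2] - (E[XY])^2 \).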
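And a quick numerical sanity check in Python, if that helps. The normal marginal for \( X \) and the specific numbers are arbitrary illustrative choices, not anything from the original question:

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative marginals: X ~ Normal(mu, s^2), and Y is the
# location-scale transform of X forced by Corr(X, Y) = 1.
mu, s = 2.0, 1.5          # mean and sd of X
mu_y, s_y = -1.0, 0.5     # mean and sd of Y

beta = s_y / s            # slope forced by Corr(X, Y) = 1 (beta > 0)
alpha = mu_y - beta * mu  # intercept matching the means

# Exact raw moments of Normal(mu, s^2)
ex1 = mu
ex2 = mu**2 + s**2
ex3 = mu**3 + 3 * mu * s**2
ex4 = mu**4 + 6 * mu**2 * s**2 + 3 * s**4

# Var(XY) from the moment formula above
e_xy = alpha * ex1 + beta * ex2
e_xy2 = alpha**2 * ex2 + 2 * alpha * beta * ex3 + beta**2 * ex4
print("formula   :", e_xy2 - e_xy**2)

# Monte Carlo check: Y = alpha + beta * X is perfectly correlated with X
x = rng.normal(mu, s, size=1_000_000)
y = alpha + beta * x
print("simulation:", np.var(x * y))
```

Both lines should print roughly 1.375 for these particular numbers.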