Pairwise independence vs. mutual independence

StatOp

New Member
I'm doing a PhD in statistical optics and I've stumbled upon a set of random variables that may or may not be mutually independent. I've managed to show that they are pairwise independent, but in general that doesn't imply mutual independence. Are there conditions that guarantee that a set of pairwise independent variables is also mutually independent? I need mutual independence in order to express the expectation of a product as the product of expectations.

Dason

Ambassador to the humans
Are you working with theoretical random variables or are you deriving this from data?

hlsmith

Not a robit
So, is there potentially a unique function that generates them, or are you assuming this? Do you know the data-generating process/function?

I can grab the cost of milk and a national growth rate and they can be correlated and possibly dependent - so I think if you are not using data generated in a vacuum or on your computer, there are always issues of possible correlations, or type I error concerns if sample sizes are small. I'm not aware of a test, but I wouldn't be surprised if there were one. There are tests to see whether data were randomly generated (e.g., Benford's law?), so testing for dependence without deep contextual knowledge of the source seems possible, but not conclusive. Sorry if I'm just rambling here without being of any great help.

Dason

Ambassador to the humans
Can you give us more details then? Do you have the joint pdf/cdf?

StatOp

New Member
I'll state the problem later tonight with all known information. Is it possible to write LaTeX code here?

Dason

Ambassador to the humans
If you wrap the LaTeX in math tags it should render how you want. For example, if I use curly braces { } instead of square brackets, the opening tag would look like {math} - but use square brackets - and close your LaTeX with {/math}. So, for example, the following is \frac{1}{2} wrapped in the math tags:

$$\frac{1}{2}$$

A JavaScript library does the conversion, so you might need to reload the page after you post for it to render correctly.

StatOp

New Member
Thanks. I need to check something before posting the problem.


StatOp

New Member
Got a bit sidetracked, but here's the problem I'm trying to solve:

Consider a set of $$N^4$$ random variables $$\Gamma_{klpq}=(\theta_{kl}+\phi_{pq}) \bmod 2\pi$$, where $$k,l,p,q$$ are integers with $$1 \le k,l,p,q \le N$$.

The following is known:
- Each of the $$N^2$$ random variables $$\theta_{kl}$$ and the $$N^2$$ random variables $$\phi_{pq}$$ has a marginal density that is uniform on the interval $$[0, 2\pi[$$ and zero elsewhere.
- The $$2N^2$$ variables $$\theta_{11}, \theta_{12}, \ldots, \theta_{NN}, \phi_{11}, \phi_{12}, \ldots, \phi_{NN}$$ are mutually independent.

I've managed to show that the $$\Gamma$$ variables are pairwise independent. That is, any two distinct $$\Gamma$$ variables, e.g. $$\Gamma_{klpq}$$ and $$\Gamma_{k'l'pq}$$ (which share the factor $$\phi_{pq}$$), or analogously $$\Gamma_{klpq}$$ and $$\Gamma_{klp'q'}$$ (which share the factor $$\theta_{kl}$$), are independent. Note that if it weren't for the modulo $$2\pi$$ in the definition of $$\Gamma_{klpq}$$, there would be no pairwise independence.
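To illustrate the role of the modulo numerically (this sketch is my own, not part of the derivation above; the variable names are hypothetical): with a shared $$\theta$$, the raw sums $$\theta+\phi_1$$ and $$\theta+\phi_2$$ are visibly correlated, while after reducing mod $$2\pi$$ the pair decouples.

```python
# Monte Carlo sketch (an assumption-laden illustration, not a proof):
# theta, phi1, phi2 ~ Uniform[0, 2*pi), mutually independent.
# Without the modulo, the shared theta induces correlation
# Var(theta) / (Var(theta) + Var(phi)) = 1/2 between the raw sums.
# With the modulo, each Gamma is again Uniform[0, 2*pi) and the
# shared-theta pair looks independent.
import math
import random

random.seed(0)
TWO_PI = 2 * math.pi
n = 200_000

theta = [random.uniform(0, TWO_PI) for _ in range(n)]
phi1 = [random.uniform(0, TWO_PI) for _ in range(n)]
phi2 = [random.uniform(0, TWO_PI) for _ in range(n)]

def corr(x, y):
    """Sample Pearson correlation of two equal-length sequences."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Raw sums: correlated through the shared theta
raw1 = [t + p for t, p in zip(theta, phi1)]
raw2 = [t + p for t, p in zip(theta, phi2)]

# Mod 2*pi: the pair decouples
g1 = [(t + p) % TWO_PI for t, p in zip(theta, phi1)]
g2 = [(t + p) % TWO_PI for t, p in zip(theta, phi2)]

print(corr(raw1, raw2))                 # close to 0.5
print(corr([math.cos(a) for a in g1],
           [math.cos(a) for a in g2]))  # close to 0
```

Near-zero correlation of $$\cos\Gamma_1$$ and $$\cos\Gamma_2$$ is of course only consistent with independence, not a proof of it.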

When looking at more than two random variables, it's important to distinguish between pairwise independence and mutual independence. The latter is a stronger requirement and allows us to express an expectation of a product as the product of expectations, which is what I need. Therefore, I want to find out whether the set of all $$\Gamma$$ variables, $$\Gamma_{1111}, \Gamma_{1112}, \ldots, \Gamma_{NNNN}$$, is mutually independent.
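One way to probe the product-of-expectations property numerically (again a sketch of my own, not from the derivation; agreement is a necessary condition for mutual independence, never a proof) is to compare a Monte Carlo estimate of $$E[\cos\Gamma_1 \cos\Gamma_2 \cos\Gamma_3]$$ with the product of the individual estimates, for a triple of $$\Gamma$$ variables that share $$\theta$$ and $$\phi$$ factors:

```python
# Sketch (hypothetical names, not from the thread): compare
# E[cos(G1) cos(G2) cos(G3)] with E[cos(G1)] E[cos(G2)] E[cos(G3)]
# for three Gammas with shared factors:
#   G1 = (theta1 + phi1) mod 2*pi
#   G2 = (theta1 + phi2) mod 2*pi   (shares theta1 with G1)
#   G3 = (theta2 + phi1) mod 2*pi   (shares phi1 with G1)
import math
import random

random.seed(1)
TWO_PI = 2 * math.pi
n = 200_000

prod_mean = 0.0        # running sum of cos(G1)*cos(G2)*cos(G3)
m1 = m2 = m3 = 0.0     # running sums of the individual cosines
for _ in range(n):
    theta1 = random.uniform(0, TWO_PI)
    theta2 = random.uniform(0, TWO_PI)
    phi1 = random.uniform(0, TWO_PI)
    phi2 = random.uniform(0, TWO_PI)
    g1 = (theta1 + phi1) % TWO_PI
    g2 = (theta1 + phi2) % TWO_PI
    g3 = (theta2 + phi1) % TWO_PI
    c1, c2, c3 = math.cos(g1), math.cos(g2), math.cos(g3)
    prod_mean += c1 * c2 * c3
    m1 += c1
    m2 += c2
    m3 += c3

prod_mean /= n
sep = (m1 / n) * (m2 / n) * (m3 / n)
print(prod_mean, sep)  # both close to 0 for this triple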
