117k views
3 votes
Suppose X and Y are continuous random variables with joint density f(x,y). Show that E[Y∣X] reduces to E[Y] if X and Y are independent. Now suppose X and Y are random variables in L², each with mean zero. If X has nonzero variance and E[Y∣X]=cX for some constant c, rigorously determine c by applying the definition of conditional expectation.

asked by User Teecraft (8.2k points)

1 Answer

2 votes

Final answer:

To show that E[Y|X] simplifies to E[Y] for independent X and Y, we use the fact that the joint density factors into the product of the marginal densities. To find the constant c in E[Y|X]=cX, we apply the defining property of conditional expectation: multiplying both sides by X and taking expectations gives cE[X²]=E[XY], and since X has mean zero, E[X²] is the (nonzero) variance of X, so c = E[XY]/E[X²].

Step-by-step explanation:

To show that E[Y|X] reduces to E[Y] when X and Y are independent, we use the definition of independence and the properties of expected values. Independence implies that the joint density f(x,y) factors as the product of the marginal densities f(x) and f(y), so f(x,y) = f(x)f(y). Wherever f(x) > 0, the conditional density is therefore f(y|x) = f(x,y)/f(x) = f(y), so the conditional expectation E[Y|X=x], which is the integral of y f(y|x) over y, simplifies to the integral of y f(y), yielding E[Y] for every such x.
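In LaTeX notation, a sketch of that chain of equalities (the subscripts on the marginal densities are added here only for clarity, and the conditional density is defined only where f_X(x) > 0):

f_{Y\mid X}(y \mid x) = \frac{f(x,y)}{f_X(x)} = \frac{f_X(x)\, f_Y(y)}{f_X(x)} = f_Y(y),

E[Y \mid X = x] = \int_{-\infty}^{\infty} y\, f_{Y\mid X}(y \mid x)\, dy = \int_{-\infty}^{\infty} y\, f_Y(y)\, dy = E[Y].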

With X and Y in L² and both having mean zero, suppose E[Y|X]=cX for some constant c. Taking expectations of both sides only confirms that E[Y] = E[E[Y|X]] = cE[X] = 0, which holds trivially and does not determine c. Instead, multiply both sides by X and take expectations: by the defining property of conditional expectation, E[XY] = E[X·E[Y|X]] = E[cX²] = cE[X²]. Since E[X] = 0, E[X²] equals the variance of X, a nonzero quantity, so c = E[XY]/E[X²] = Cov(X,Y)/Var(X).
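As a LaTeX sketch of that computation (the first equality is the tower property, the second pulls the X-measurable factor X out of the inner conditional expectation):

E[XY] = E\big[\,E[XY \mid X]\,\big] = E\big[X\, E[Y \mid X]\big] = E\big[cX^2\big] = c\, E[X^2],

c = \frac{E[XY]}{E[X^2]} = \frac{\operatorname{Cov}(X,Y)}{\operatorname{Var}(X)}.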

answered by User Hoaphumanoid (7.6k points)