Tuesday, January 23, 2024

Ex 1.10 - On mean and variance of two variables (Pattern Recognition and Machine Learning by Christopher Bishop)

Question:

Suppose that the two variables x and z are independent. Show that the mean and variance of their sum satisfy E[x + z] = E[x] + E[z] and var[x + z] = var[x] + var[z].

Answer:

From the solutions to the book

Using ChatGPT

If x and y are two independent continuous random variables, can you derive the expression for E[x + y]?

Certainly! If x and y are two independent continuous random variables, you can find the expected value of their sum, E[x + y], by using the property that for independent random variables, the expected value of the sum is the sum of the expected values. Mathematically:

E[x + y] = E[x] + E[y]

So, the expected value of the sum of two independent continuous random variables is simply the sum of their individual expected values.
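To sanity-check this property numerically, here is a minimal sketch using NumPy (the exponential and uniform distributions, their parameters, and the sample size are arbitrary choices made for illustration, not part of the exercise):

import numpy as np

rng = np.random.default_rng(seed=0)
n = 1_000_000

# Two independent continuous random variables with different distributions.
x = rng.exponential(scale=2.0, size=n)        # theoretical E[x] = 2.0
y = rng.uniform(low=-1.0, high=3.0, size=n)   # theoretical E[y] = 1.0

# The empirical mean of the sum matches the sum of the empirical means,
# and both are close to the theoretical value 2.0 + 1.0 = 3.0.
print(np.mean(x + y))
print(np.mean(x) + np.mean(y))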

If you have specific probability density functions (PDFs) for x and y, you can also find E[x + y] by integrating over all possible values of x and y:

E[x + y] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x + y) \cdot f(x) \cdot g(y) \,dx \,dy

Here, f(x) and g(y) are the probability density functions of x and y, respectively.

If x and y have known probability density functions, you can substitute them into the above expressions to find the expected value of their sum.
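For instance, the double integral above can be evaluated numerically with SciPy. The following sketch assumes, purely as an example, that f is an exponential PDF with mean 2.0 and g is a normal PDF with mean 1.0 (these choices are mine, not from the exercise); the result should come out close to E[x] + E[y] = 3.0:

import numpy as np
from scipy.integrate import dblquad
from scipy.stats import expon, norm

# Example marginal PDFs (assumed for illustration).
f = lambda x: expon.pdf(x, scale=2.0)          # support x >= 0, E[x] = 2.0
g = lambda y: norm.pdf(y, loc=1.0, scale=0.5)  # E[y] = 1.0

# dblquad integrates func(y, x): y is the inner variable, x the outer one.
# Here: E[x + y] = \int_0^\infty \int_{-\infty}^{\infty} (x + y) f(x) g(y) dy dx
value, abserr = dblquad(lambda y, x: (x + y) * f(x) * g(y),
                        0.0, np.inf,                          # x over the support of f
                        lambda x: -np.inf, lambda x: np.inf)  # y over the support of g

print(value)  # approximately 3.0 = E[x] + E[y]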

Please integrate the last expression and show that for i.i.d. x and y, E[x + y] = E[x] + E[y].

Certainly! Let's integrate the last expression and show that for independent and identically distributed (i.i.d.) random variables x and y, E[x + y] = E[x] + E[y].

Assuming f(x) and g(y) are the probability density functions (PDFs) of x and y respectively, and that x and y are independent (identical distribution is not actually needed for this step), the expression for E[x + y] is:

E[x + y] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x + y) \cdot f(x) \cdot g(y) \,dx \,dy

Now, let's perform the integration. Splitting the integrand (x + y) into two terms gives:

= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x \cdot f(x) \cdot g(y) \,dx \,dy + \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} y \cdot f(x) \cdot g(y) \,dx \,dy

Now, by the independence of x and y, the joint PDF f(x, y) can be expressed as the product of the individual PDFs:

f(x, y) = f(x) \cdot g(y)

So, we can simplify the above expression by factoring each double integral into a product of single integrals:

= \int_{-\infty}^{\infty} x \cdot f(x) \left(\int_{-\infty}^{\infty} g(y) \,dy\right) \,dx + \int_{-\infty}^{\infty} y \left(\int_{-\infty}^{\infty} f(x) \,dx\right) \cdot g(y) \,dy

The integrals inside the parentheses evaluate to 1 because f(x) and g(y) are probability density functions, which integrate to 1 over their respective ranges. Therefore:

= \int_{-\infty}^{\infty} x \cdot f(x) \,dx + \int_{-\infty}^{\infty} y \cdot g(y) \,dy

This is exactly E[x] + E[y], showing that for independent x and y, E[x + y] = E[x] + E[y]. Note that only independence was used in the derivation; in fact, by linearity of expectation, E[x + y] = E[x] + E[y] holds even for dependent variables, although the double integral then involves the joint PDF rather than a product of marginals.
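As a concrete illustration (the exponential distribution here is an assumed example, not taken from the book), let x and y be i.i.d. Exponential(\lambda), so f(t) = g(t) = \lambda e^{-\lambda t} for t \geq 0. Then:

E[x] = \int_{0}^{\infty} t \cdot \lambda e^{-\lambda t} \,dt = \frac{1}{\lambda}, \quad E[y] = \frac{1}{\lambda}

and evaluating the double integral directly, using \int_{0}^{\infty} (x + y) \cdot \lambda e^{-\lambda x} \,dx = \frac{1}{\lambda} + y for the inner integral:

E[x + y] = \int_{0}^{\infty} \int_{0}^{\infty} (x + y) \cdot \lambda e^{-\lambda x} \cdot \lambda e^{-\lambda y} \,dx \,dy = \int_{0}^{\infty} \left(\frac{1}{\lambda} + y\right) \lambda e^{-\lambda y} \,dy = \frac{1}{\lambda} + \frac{1}{\lambda} = \frac{2}{\lambda}

which agrees with E[x] + E[y].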
