The rate of return on your portfolio, R, has a mean of 1% and a standard deviation of 5%. Suppose that (1 + R) is lognormally distributed.

1) Find the rate of return y such that the probability that R is less than or equal to y equals 10%.

2) Calculate the probability that R is greater than 10%.

P.S. I understand that the shape of a lognormal distribution is skewed to the right. But I do not understand how to solve this. Is there a trick to simplify this to a normal distribution? If so, please help and explain it to me. Thank you.
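Edit: the standard trick is that if (1 + R) is lognormal, then X = ln(1 + R) is normal, so both parts reduce to standard-normal lookups once you match the given mean and standard deviation of (1 + R) to the normal parameters mu and sigma. A quick Python sketch of that moment-matching (using only the standard library; the numbers 1.01 and 0.05 come from the problem statement):

```python
import math
from statistics import NormalDist

# Moment matching: if 1 + R is lognormal with E[1+R] = m and SD[1+R] = s,
# then X = ln(1 + R) is Normal(mu, sigma) with
#   sigma^2 = ln(1 + s^2 / m^2)
#   mu      = ln(m) - sigma^2 / 2
m, s = 1.01, 0.05                      # E[1+R] = 1 + 1%, SD[1+R] = 5%
sigma = math.sqrt(math.log(1 + s**2 / m**2))
mu = math.log(m) - sigma**2 / 2

Z = NormalDist()                       # standard normal

# 1) 10th percentile of R: solve P(R <= y) = 0.10,
#    i.e. y = exp(mu + z_0.10 * sigma) - 1
y = math.exp(mu + Z.inv_cdf(0.10) * sigma) - 1
print(f"y = {y:.4%}")                  # ≈ -5.32%

# 2) P(R > 10%) = P(X > ln 1.10) = 1 - Phi((ln 1.10 - mu) / sigma)
p = 1 - Z.cdf((math.log(1.10) - mu) / sigma)
print(f"P(R > 10%) = {p:.4%}")         # ≈ 4.00%
```

The key step is the first comment block: inverting the lognormal mean/variance formulas E[1+R] = exp(mu + sigma^2/2) and Var[1+R] = exp(2*mu + sigma^2)(exp(sigma^2) - 1) for mu and sigma.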