Standard deviation of derived variable (x/y)*100

Hi all,

I would like to create a variable derived from two independent samples. For example:

Sample a
Sample size: 100
Mean (y): 67
Std Dev: 35

Sample b
Sample size: 100
Mean (x): 80
Std Dev: 45

I would like to derive a variable x/y*100 (i.e. x as a percentage of y), but I am unsure how to calculate its standard deviation.

Would anyone be able to help point me in the right direction?
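(One common direction, assuming the quantity of interest is the ratio of the two sample *means*, is the delta-method approximation for the standard error of a ratio of independent estimates. A minimal sketch using the numbers above — this is an illustrative assumption, not necessarily the only valid approach:)

```python
import math

# Summary statistics from the post
n_a, mean_y, sd_y = 100, 67, 35  # sample a (denominator)
n_b, mean_x, sd_x = 100, 80, 45  # sample b (numerator)

# Standard errors of the two sample means
se_x = sd_x / math.sqrt(n_b)
se_y = sd_y / math.sqrt(n_a)

# Ratio of means, expressed as a percentage
ratio = mean_x / mean_y * 100

# Delta-method approximation for independent estimates:
# SE(R) / R ~= sqrt((SE_x / x)^2 + (SE_y / y)^2)
se_ratio = ratio * math.sqrt((se_x / mean_x) ** 2 + (se_y / mean_y) ** 2)

print(round(ratio, 1))     # ratio of means as a percentage, ~119.4
print(round(se_ratio, 1))  # approximate standard error, ~9.2
```

Note this gives the standard error of the estimated ratio of means, not the standard deviation of a per-observation ratio; the latter would require paired observations, which two independent samples do not provide.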

Many thanks!