The trick in this question is not to get intimidated by the terms mean and standard deviation.
The mean is the salary of the player at the 50th percentile (for a normal distribution the mean and the median coincide). Suppose the league has 100 players and we line them up in order of increasing salary; the player standing 50th in line earns the mean salary.
So the 50th player in line earned $10,000 in 1960, and $1,175,000 in 2010.
Look at the curve below. One standard deviation above the mean corresponds to the 84.13th percentile (50% + 34.13%), and one standard deviation below the mean corresponds to the 15.87th percentile (50% - 34.13%).
[Attachment: bell curve.png (the normal distribution curve)]
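To see where those percentages come from, here is a small sketch that evaluates the standard normal CDF at one and two standard deviations below the mean (and one above); it assumes Python with scipy installed, which is not part of the original question:

```python
# Quick check of the normal-distribution percentiles used above.
from scipy.stats import norm

print(norm.cdf(-1))  # ~0.1587 -> 15.87th percentile (one SD below the mean)
print(norm.cdf(-2))  # ~0.0228 -> 2.28th percentile (two SDs below the mean)
print(norm.cdf(1))   # ~0.8413 -> 84.13th percentile (one SD above the mean)
```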
So in 1960, players earning $10,000 were at the 50th percentile, players earning $9,000 ($10,000 - $1,000) were at the 15.87th percentile, and players earning $8,000 ($10,000 - 2 × $1,000) were at the 2.28th percentile.
The question suggests that all the rookie players were in the bottom 2.28% of the population, which corresponds to two standard deviations below the mean.
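The same conclusion in z-score form, using the $8,000 salary quoted above (a small illustrative calculation, not something the question asks for):

```python
# z-score of the $8,000 rookie salary under the 1960 distribution
mean_1960, sd_1960 = 10_000, 1_000
z = (8_000 - mean_1960) / sd_1960
print(z)  # -2.0, i.e. two standard deviations below the mean
```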
Applying the same distribution to the 2010 salaries, the rookies in 2010 were also earning two standard deviations below the mean.
So the rookie income in 2010 = mean income - 2 × standard deviation = $1,175,000 - 2 × $300,000 = $575,000.
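As a final check, the same arithmetic in a couple of lines (again just an illustrative sketch with the figures quoted above):

```python
# Map the 1960 rookie position (two standard deviations below the mean)
# onto the 2010 salary distribution.
mean_2010, sd_2010 = 1_175_000, 300_000
rookie_2010 = mean_2010 - 2 * sd_2010
print(rookie_2010)  # 575000 -> $575,000
```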