Suppose you have a distribution of 500 IQ scores, normally distributed with a mean of 100 and a standard deviation of 15, and all scores are integers. One new data point is then added that lies far above every score previously in the distribution. For each of the following questions, report whether the value would increase, decrease, stay the same, or whether there is not enough information to say.
What happens to the value of the mean?
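A quick way to see the answer is a small simulation. The sketch below (the seed, sample size, and the choice of `max(scores) + 50` as the outlier are illustrative assumptions, not part of the problem statement) draws 500 rounded normal scores, appends a point above the current maximum, and compares the means:

```python
import random
import statistics

random.seed(0)

# Simulate 500 integer IQ scores: draws from N(100, 15), rounded to integers.
scores = [round(random.gauss(100, 15)) for _ in range(500)]
old_mean = statistics.mean(scores)

# Add one new point well above anything already in the distribution.
outlier = max(scores) + 50  # illustrative choice; any value above the max works
new_mean = statistics.mean(scores + [outlier])

# Since new_mean = (500 * old_mean + outlier) / 501, the mean rises
# whenever the added point exceeds old_mean, which it does here.
print(old_mean, new_mean)
```

Because the new point is above the old mean, the mean must increase; the simulation just makes the algebra concrete.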