#### jaredcolbert1

##### New member

- Joined: Dec 30, 2015
- Messages: 1

You have a data set consisting of the sales prices of houses in your neighborhood, with each sale time-stamped by the month and year in which the house was sold. You want to predict the average value of houses in your neighborhood over time, so you fit a simple regression model with average house price as the output and the time index (in months) as the input. Based on 10 months of data, the estimated intercept is $4569 and the estimated slope is 143 ($/month). If you extrapolate this trend forward in time, at which time index (in months) do you predict that your neighborhood's value will have doubled relative to the value at month 10? (Round to the nearest month).

I have a feeling that I am misinterpreting what is being asked, because it seems like a simple extrapolation:

y = mx + b

y should be set equal to double the intercept (9138)

9138 = slope(x) + intercept

9138 = 143(x) + 4569

x = (9138 - 4569) / 143 = 4569 / 143

x = 31.95
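For what it's worth, here is a minimal Python sketch of the calculation attempted above (doubling the intercept and solving y = mx + b for x), just to confirm the arithmetic itself is consistent:

```python
# Sketch of the extrapolation attempted above:
# set y to double the intercept and solve y = slope*x + intercept for x.
intercept = 4569.0  # estimated intercept ($)
slope = 143.0       # estimated slope ($/month)

target = 2 * intercept            # 9138: double the intercept
x = (target - intercept) / slope  # solve 9138 = 143*x + 4569
print(round(x, 2))                # 31.95
```

So the algebra checks out; the issue must be in how I set up the target value, not the arithmetic.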

Can anyone gently nudge me in the right direction? This answer is not correct.

Much appreciated!!