Scatter Plots And y=mx+b

br0okebleep

New member
Joined
Nov 16, 2020
Messages
2
So I'm doing scatter plots right now and one of the questions says:

Use the points (2001, 17.60) and (2002, 18.75) to write
the slope-intercept form of equation for the line of fit
shown in the scatter plot.

How do I find the equation of the line of best fit?
 
The problem says "Use the points (2001, 17.60) and (2002, 18.75) to write the slope-intercept form of the equation for the line of fit shown in the scatter plot." That is not necessarily the "line of best fit".

The "slope-intercept form" is y = ax + b. Since the points (2001, 17.60) and (2002, 18.75) lie on the line, they must satisfy 17.60 = a(2001) + b and 18.75 = a(2002) + b. Solve those two equations for a and b. I suggest you start by subtracting the first equation from the second: that eliminates b, leaving a single equation to solve for a.
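The subtract-and-back-substitute steps above can be sketched in a few lines of Python (variable names are my own, not from the problem):

```python
# The two points given in the problem: (year, value)
x1, y1 = 2001, 17.60
x2, y2 = 2002, 18.75

# Subtracting 17.60 = a*x1 + b from 18.75 = a*x2 + b eliminates b:
# y2 - y1 = a*(x2 - x1), so the slope is the difference quotient.
a = (y2 - y1) / (x2 - x1)   # approximately 1.15

# Back-substitute into y1 = a*x1 + b to recover the intercept.
b = y1 - a * x1             # approximately -2283.55

print(f"y = {a:.2f}x + {b:.2f}")
```

Note how large the intercept is: with x-values near 2000, the line crosses the y-axis far below zero, which is normal for year-based data.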

If you have more than two points, compute the residual at each given point (x, y): the model value ax + b minus the observed y. The "least squares" error is the sum of the squares of those residuals. That sum is a function of a and b; to make it least, differentiate with respect to a and with respect to b and set both derivatives equal to 0. Again, that gives you two equations to solve for a and b.

But if the problem is just what you quote, to use just two points, you don't need to do that.
 

Attachments

  • 1605816347878.png · 179.2 KB