# Method of Least Squares for Multiple Regression Detailed

Theorem 1: The regression line has form

$$\hat{y} = b_0 + b_1 x_1 + \cdots + b_k x_k \qquad \text{where } b_0 = \bar{y} - b_1 \bar{x}_1 - \cdots - b_k \bar{x}_k$$

where the coefficients $b_m$ are the solutions to the following k equations in k unknowns:

$$\sum_{j=1}^{k} b_j \sum_{i=1}^{n} (x_{ij}-\bar{x}_j)(x_{im}-\bar{x}_m) = \sum_{i=1}^{n} (y_i-\bar{y})(x_{im}-\bar{x}_m) \qquad m = 1, \ldots, k$$
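The following is a minimal numerical sketch of the theorem's recipe, assuming numpy and synthetic data (all names and values are illustrative): it builds the k centered equations above, solves them for $b_1, \ldots, b_k$, and recovers $b_0$ from the means.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 50, 3                        # illustrative sizes: n observations, k predictors
X = rng.normal(size=(n, k))         # X[i, m-1] plays the role of x_im
y = 2.0 + X @ np.array([1.5, -0.5, 3.0]) + rng.normal(scale=0.1, size=n)

Xc = X - X.mean(axis=0)             # centered predictors: x_im - x̄_m
yc = y - y.mean()                   # centered response:   y_i - ȳ

# The k equations of Theorem 1 as S @ b = t, where
# S[m, j] = sum_i (x_ij - x̄_j)(x_im - x̄_m) and t[m] = sum_i (y_i - ȳ)(x_im - x̄_m)
S = Xc.T @ Xc
t = Xc.T @ yc
b = np.linalg.solve(S, t)           # b_1, ..., b_k

b0 = y.mean() - X.mean(axis=0) @ b  # b_0 = ȳ - b_1 x̄_1 - ... - b_k x̄_k
print(b0, b)                        # close to the true values (2.0, 1.5, -0.5, 3.0)
```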

Proof: Our objective is to find the values of the coefficients $b_i$ for which the sum of the squares

$$\sum_{i=1}^{n} (y_i - \hat{y}_i)^2$$

is minimum, where $\hat{y}_i = b_0 + b_1 x_{i1} + \cdots + b_k x_{ik}$ is the y-value on the best-fit line corresponding to $x_{i1}, \ldots, x_{ik}$. Now,

$$y_i - \hat{y}_i = y_i - b_0 - b_1 x_{i1} - \cdots - b_k x_{ik}$$

For any given values of $(x_{11}, \ldots, x_{1k}, y_1), \ldots, (x_{n1}, \ldots, x_{nk}, y_n)$, this expression can be viewed as a function of the $b_i$, namely $g(b_0, \ldots, b_k)$:

$$g(b_0, \ldots, b_k) = \sum_{i=1}^{n} \left(y_i - b_0 - \sum_{j=1}^{k} b_j x_{ij}\right)^2$$

By calculus, the minimum value occurs when all the partial derivatives are zero, i.e.

$$\frac{\partial g}{\partial b_m} = -2\sum_{i=1}^{n} x_{im}\left(y_i - b_0 - \sum_{j=1}^{k} b_j x_{ij}\right) = 0 \qquad m = 1, \ldots, k$$

$$\frac{\partial g}{\partial b_0} = -2\sum_{i=1}^{n} \left(y_i - b_0 - \sum_{j=1}^{k} b_j x_{ij}\right) = 0$$
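As a sanity check on these partials, one can compare them against finite differences of $g$ at arbitrary coefficients. A small sketch, assuming numpy and made-up data (every name here is illustrative, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 40, 2
X = rng.normal(size=(n, k))         # x_im values
y = rng.normal(size=n)              # y_i values

def g(b):
    # g(b_0, ..., b_k) = sum_i (y_i - b_0 - sum_j b_j x_ij)^2
    return np.sum((y - b[0] - X @ b[1:]) ** 2)

def grad_g(b):
    # the analytic partials stated above: dg/db_0 and dg/db_m
    r = y - b[0] - X @ b[1:]        # residuals
    return np.concatenate(([-2 * r.sum()], -2 * X.T @ r))

b = rng.normal(size=k + 1)          # arbitrary coefficient values
eps = 1e-6
numeric = np.array([(g(b + eps * e) - g(b - eps * e)) / (2 * eps)
                    for e in np.eye(k + 1)])
print(np.allclose(grad_g(b), numeric, rtol=1e-4))   # True: partials match
```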

Transposing terms, we have

$$\sum_{i=1}^{n} x_{im} y_i = b_0 \sum_{i=1}^{n} x_{im} + \sum_{j=1}^{k} b_j \sum_{i=1}^{n} x_{im} x_{ij} \qquad m = 1, \ldots, k$$

$$\sum_{i=1}^{n} y_i = n b_0 + \sum_{j=1}^{k} b_j \sum_{i=1}^{n} x_{ij}$$
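In matrix form these $k+1$ equations read $A^T A\, b = A^T y$ for the design matrix $A$ whose rows are $(1, x_{i1}, \ldots, x_{ik})$; equivalently, the residual vector is orthogonal to every column of $A$. A short sketch under the same assumptions as above (numpy, toy data):

```python
import numpy as np

rng = np.random.default_rng(4)
n, k = 25, 2
X = rng.normal(size=(n, k))
y = rng.normal(size=n)

A = np.column_stack([np.ones(n), X])      # design matrix: rows (1, x_i1, ..., x_ik)
coef = np.linalg.solve(A.T @ A, A.T @ y)  # the k+1 transposed equations at once

# Column 0 of A reproduces the b_0 equation, column m the m-th equation:
# each says the residual y - A @ coef is orthogonal to that column.
print(np.allclose(A.T @ (y - A @ coef), 0.0))   # True
```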

Further simplifying, we can rewrite each residual in terms of deviations from the means:

$$y_i - b_0 - \sum_{j=1}^{k} b_j x_{ij} = (y_i - \bar{y}) - \sum_{j=1}^{k} b_j (x_{ij} - \bar{x}_j) + \left(\bar{y} - b_0 - \sum_{j=1}^{k} b_j \bar{x}_j\right)$$

Substituting this into the $k+1$ equations (and dividing out the factor $-2$) gives

$$\sum_{i=1}^{n} x_{im}\left[(y_i - \bar{y}) - \sum_{j=1}^{k} b_j (x_{ij} - \bar{x}_j) + \left(\bar{y} - b_0 - \sum_{j=1}^{k} b_j \bar{x}_j\right)\right] = 0 \qquad m = 1, \ldots, k$$

$$\sum_{i=1}^{n} \left[(y_i - \bar{y}) - \sum_{j=1}^{k} b_j (x_{ij} - \bar{x}_j) + \left(\bar{y} - b_0 - \sum_{j=1}^{k} b_j \bar{x}_j\right)\right] = 0$$

But since $\sum\nolimits_{i=1}^n (x_{im}-\bar{x}_m)$ = 0 (and likewise $\sum\nolimits_{i=1}^n (y_i-\bar{y})$ = 0), the last equation becomes

$$n\left(\bar{y} - b_0 - \sum_{j=1}^{k} b_j \bar{x}_j\right) = 0$$

i.e.

$$b_0 = \bar{y} - b_1 \bar{x}_1 - \cdots - b_k \bar{x}_k$$
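This intercept identity is easy to verify on data: fit with a library routine and check that the fitted hyperplane passes through the point of means. A sketch assuming numpy (np.linalg.lstsq is just one convenient baseline; the data are made up):

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 30, 3
X = rng.normal(size=(n, k))
y = 1.0 + X @ rng.normal(size=k) + rng.normal(size=n)

A = np.column_stack([np.ones(n), X])            # intercept column plus predictors
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
b0, b = coef[0], coef[1:]

# b_0 = ȳ - b_1 x̄_1 - ... - b_k x̄_k: the fit passes through the means
print(np.isclose(b0, y.mean() - X.mean(axis=0) @ b))   # True
# equivalently, the residuals sum to zero (the last equation above)
print(np.isclose((y - A @ coef).sum(), 0.0))           # True
```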

The remaining k equations are (the constant term in brackets now being zero):

$$\sum_{i=1}^{n} x_{im}\left[(y_i - \bar{y}) - \sum_{j=1}^{k} b_j (x_{ij} - \bar{x}_j)\right] = 0 \qquad m = 1, \ldots, k$$

These are equivalent to

$$\sum_{j=1}^{k} b_j \sum_{i=1}^{n} (x_{ij}-\bar{x}_j)(x_{im}-\bar{x}_m) = \sum_{i=1}^{n} (y_i-\bar{y})(x_{im}-\bar{x}_m) \qquad m = 1, \ldots, k$$

since replacing $x_{im}$ by $x_{im}-\bar{x}_m$ only adds multiples of $\sum\nolimits_{i=1}^n (y_i-\bar{y})$ and $\sum\nolimits_{i=1}^n (x_{ij}-\bar{x}_j)$, both of which are zero. These are exactly the k equations of the theorem.

Since we have k equations in k unknowns (the $b_m$), the system can be solved for $b_1, \ldots, b_k$, provided the coefficient matrix is invertible; $b_0$ is then given by the equation above.
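To close the loop, here is a sketch that checks the theorem end to end, assuming numpy and synthetic data: solve the k centered equations, recover $b_0$, and compare against a standard least-squares fit of the full design matrix.

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 60, 4
X = rng.normal(size=(n, k))
y = 0.5 + X @ rng.normal(size=k) + rng.normal(size=n)

# Theorem 1's route: k centered equations for b_1..b_k, then b_0 from the means
Xc, yc = X - X.mean(axis=0), y - y.mean()
b = np.linalg.solve(Xc.T @ Xc, Xc.T @ yc)
b0 = y.mean() - X.mean(axis=0) @ b

# Baseline: least-squares fit of the full design matrix [1, X]
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

print(np.allclose(coef, np.concatenate(([b0], b))))   # True: the routes agree
```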