Linear Regression with Multiple Variables


Gradient Descent in Practice II: Learning Rate

  • The cost function J(θ) should decrease after every iteration of gradient descent.

  • Automatic convergence test: for example, declare convergence if J(θ) decreases by less than some small threshold (e.g. 10^-3) in one iteration.

  • Plot J(θ) against the number of iterations to verify that gradient descent is converging.

  • For the learning rate α, if convergence is slow, try a value roughly 3x the previous attempt. Ex: ..., 0.001, 0.003, 0.01, 0.03, 0.1, 0.3, 1, ... (see the sketch after this list).
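A minimal sketch of the points above, written in Python/NumPy rather than the course's Octave (an assumption, since the notes contain no code): gradient descent that records J(θ) every iteration, applies the convergence test, and sweeps learning rates roughly 3x apart. The data, tolerance, and function name are made up for illustration.

```python
import numpy as np

def gradient_descent(X, y, alpha, num_iters=400, tol=1e-3):
    """Batch gradient descent for linear regression.

    Records the cost J(theta) each iteration so the curve can be plotted,
    and stops early once J decreases by less than `tol` in one iteration.
    """
    m, n = X.shape
    theta = np.zeros(n)
    J_history = []
    for _ in range(num_iters):
        error = X @ theta - y                      # residuals at current theta
        J = np.sum(error ** 2) / (2 * m)           # squared-error cost J(theta)
        # Automatic convergence test: a small *decrease* in J means we are done.
        if J_history and 0 <= J_history[-1] - J < tol:
            J_history.append(J)
            break
        J_history.append(J)
        theta -= (alpha / m) * (X.T @ error)       # simultaneous update of all theta_j
    return theta, J_history

# Made-up data: m = 50 examples, 2 features plus a column of ones for the intercept.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=(50, 2))])
y = X @ np.array([1.0, 2.0, -3.0]) + rng.normal(scale=0.1, size=50)

# Sweep learning rates roughly 3x apart, as recommended above.
for alpha in [0.001, 0.003, 0.01, 0.03, 0.1, 0.3, 1.0]:
    theta, J_history = gradient_descent(X, y, alpha)
    print(f"alpha={alpha:<6} final J={J_history[-1]:.6f} iterations={len(J_history)}")
```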

How to adjust features to match the shape of the data: Polynomial Regression

If the data doesn't look linear and, through visualization, appears to follow a different curve, define new features from the existing ones (e.g. x^2, x^3, or √x) and fit the same linear model to those features (a sketch follows below).
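A short sketch of how polynomial regression can be set up under the same assumptions (Python/NumPy, made-up data): build extra features x^2, x^3, ... from a single input and fit them with ordinary linear regression.

```python
import numpy as np

def polynomial_features(x, degree):
    """Design matrix [1, x, x^2, ..., x^degree] built from a single feature.

    Polynomial terms have very different scales, so every column except the
    intercept is mean-normalized; this keeps gradient descent well behaved.
    """
    X = np.column_stack([x ** d for d in range(degree + 1)])
    X[:, 1:] = (X[:, 1:] - X[:, 1:].mean(axis=0)) / X[:, 1:].std(axis=0)
    return X

# Made-up curved data that a straight line would fit poorly.
x = np.linspace(0.5, 4.0, 40)
y = 1.0 + 2.0 * np.sqrt(x)

X_cubic = polynomial_features(x, degree=3)
# X_cubic can be fed to the same gradient-descent routine sketched earlier,
# e.g. theta, J_history = gradient_descent(X_cubic, y, alpha=0.1)
theta = np.linalg.lstsq(X_cubic, y, rcond=None)[0]
print(theta)
```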

Comparison: Gradient Descent vs. Normal Equations

Assume m training examples, n features

  • Gradient Descent:
    • Need to choose the learning rate α
    • Needs many iterations
    • Works well even when n is large; complexity is roughly O(kn^2) for k iterations


  • Normal Equation:
    • No need to choose the learning rate α
    • No need to iterate
    • Need to compute the inverse (X^T X)^-1
    • Slow if n is very large; the matrix inversion has complexity O(n^3)


Rule of thumb: it is good to move to gradient descent once n exceeds roughly 10,000, because the normal equation's inversion becomes too slow.
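A hedged sketch of the normal equation θ = (X^T X)^-1 X^T y in the same Python/NumPy style, on made-up data; using pinv is one common way to guard against a singular X^T X, not necessarily how the course implements it.

```python
import numpy as np

def normal_equation(X, y):
    """Closed-form least squares: theta = (X^T X)^-1 X^T y.

    pinv (pseudo-inverse) is used instead of a plain inverse so the result is
    still defined when X^T X is singular (redundant features, or m <= n).
    The inversion costs roughly O(n^3), which is why this route becomes slow
    when the number of features n is very large.
    """
    return np.linalg.pinv(X.T @ X) @ X.T @ y

# Made-up data: m = 100 examples, 3 features plus an intercept column.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(100), rng.normal(size=(100, 3))])
y = X @ np.array([4.0, 1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

theta = normal_equation(X, y)
print(theta)   # close to [4.0, 1.5, -2.0, 0.5]; no learning rate and no iterations needed
```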
