1
Multiple features
Linear Regression with multiple variables (multivariate linear regression)
Machine Learning
2
Multiple features (variables).
Size (feet²)    Price ($1000)
2104            460
1416            232
1534            315
852             178
...
3
Multiple features (variables).
Size (feet²)  Number of bedrooms  Number of floors  Age of home (years)  Price ($1000)
2104          5                   1                 45                   460
1416          3                   2                 40                   232
1534          3                   2                 30                   315
852           2                   1                 36                   178
...

Notation:
  n = number of features
  x^(i) = input (features) of the i-th training example
  xⱼ^(i) = value of feature j in the i-th training example

Pop-up Quiz
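For example, with the table above (counting examples from the top): x^(2) = [1416, 3, 2, 40]ᵀ is the feature vector of the second training example, x₃^(2) = 2 is its number of floors, and n = 4.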
4
Hypothesis:
Previously (one feature): hθ(x) = θ₀ + θ₁x
Now (multiple features): hθ(x) = θ₀ + θ₁x₁ + θ₂x₂ + ... + θₙxₙ
5
For convenience of notation, define x₀ = 1, so that x = [x₀, x₁, ..., xₙ]ᵀ ∈ ℝⁿ⁺¹ and θ = [θ₀, θ₁, ..., θₙ]ᵀ ∈ ℝⁿ⁺¹. The hypothesis can then be written as hθ(x) = θᵀx.
Multivariate linear regression.
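As a concrete illustration, a minimal Octave sketch of the vectorized hypothesis; the parameter values are made up for the example:

    % Feature vector with the x0 = 1 convention: x0, size, bedrooms.
    x = [1; 2104; 5];
    % Illustrative (made-up) parameter values.
    theta = [80; 0.1; 3];
    % h_theta(x) = theta' * x  (inner product of theta and x).
    h = theta' * x;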
6
Gradient descent for multiple variables
7
Hypothesis: hθ(x) = θᵀx = θ₀x₀ + θ₁x₁ + ... + θₙxₙ (with x₀ = 1)
Parameters: θ = [θ₀, θ₁, ..., θₙ]ᵀ ∈ ℝⁿ⁺¹
Cost function: J(θ) = (1/2m) Σ_{i=1..m} (hθ(x^(i)) − y^(i))²
Gradient descent:
Repeat {
  θⱼ := θⱼ − α · ∂J(θ)/∂θⱼ
} (simultaneously update for every j = 0, ..., n)
8
Gradient Descent
Previously (n = 1):
Repeat {
  θ₀ := θ₀ − α · (1/m) Σ_{i=1..m} (hθ(x^(i)) − y^(i))
  θ₁ := θ₁ − α · (1/m) Σ_{i=1..m} (hθ(x^(i)) − y^(i)) · x^(i)
} (simultaneously update θ₀, θ₁)

New algorithm (n ≥ 1):
Repeat {
  θⱼ := θⱼ − α · (1/m) Σ_{i=1..m} (hθ(x^(i)) − y^(i)) · xⱼ^(i)
} (simultaneously update θⱼ for j = 0, ..., n)
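A vectorized Octave sketch of this update rule, assuming X is the m × (n+1) design matrix with a leading column of ones and y is the m × 1 target vector (the function and variable names are illustrative):

    function theta = gradient_descent(X, y, theta, alpha, num_iters)
      % Batch gradient descent with simultaneous update of all theta_j.
      m = length(y);
      for iter = 1:num_iters
        h = X * theta;                                 % predictions for all m examples
        theta = theta - (alpha / m) * (X' * (h - y));  % update every theta_j at once
      end
    end

The single expression X' * (h - y) computes Σᵢ (hθ(x^(i)) − y^(i)) · xⱼ^(i) for all j simultaneously, which is why no explicit loop over j is needed.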
9
Gradient descent in practice I: Feature Scaling
10
Feature Scaling
Idea: make sure the features are on a similar scale, so that gradient descent converges faster.
E.g. x₁ = size (0–2000 feet²), x₂ = number of bedrooms (1–5)
Scaled: x₁ = size (feet²) / 2000, x₂ = (number of bedrooms) / 5
[Figure: contours of J(θ) are elongated before scaling and nearly circular after]
11
Feature Scaling
Get every feature into approximately the −1 ≤ xⱼ ≤ 1 range.
12
Mean normalization
Replace xⱼ with xⱼ − μⱼ so that the features have approximately zero mean (do not apply to x₀ = 1).
E.g. x₁ = (size − 1000) / 2000, x₂ = (#bedrooms − 2) / 5
In general: xⱼ := (xⱼ − μⱼ) / sⱼ, where μⱼ is the mean of feature j and sⱼ is its range (max − min) or standard deviation.
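A short Octave sketch combining mean normalization with scaling by the range; using the standard deviation for sⱼ would work equally well. The names are illustrative, and the x₀ = 1 column should be appended only after normalizing:

    function [X_norm, mu, s] = feature_normalize(X)
      % Per-column mean and range of the raw feature matrix X.
      mu = mean(X);
      s  = max(X) - min(X);    % or std(X) for standard-deviation scaling
      % Broadcasted element-wise normalization: x_j := (x_j - mu_j) / s_j.
      X_norm = (X - mu) ./ s;
    end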
13
Gradient descent in practice II: Learning rate
14
Gradient descent
“Debugging”: how to make sure gradient descent is working correctly.
How to choose the learning rate α.
15
Making sure gradient descent is working correctly.
Plot J(θ) as a function of the number of iterations; J(θ) should decrease after every iteration.
Example automatic convergence test: declare convergence if J(θ) decreases by less than some small ε (e.g. 10⁻³) in one iteration.
[Figure: J(θ) vs. No. of iterations]
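A sketch of this test inside the descent loop; epsilon and the cost computation follow the definitions above, while the function name is an assumption for illustration:

    function theta = gd_until_converged(X, y, theta, alpha, epsilon, max_iters)
      % Run gradient descent, stopping once J(theta) decreases by
      % less than epsilon in a single iteration.
      m = length(y);
      J_prev = Inf;
      for iter = 1:max_iters
        theta = theta - (alpha / m) * (X' * (X * theta - y));
        J = (1 / (2 * m)) * sum((X * theta - y) .^ 2);  % cost after the update
        if J_prev - J < epsilon
          break;                                        % converged
        end
        J_prev = J;
      end
    end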
16
Making sure gradient descent is working correctly.
If J(θ) is increasing, or repeatedly rising and falling, gradient descent is not working: use a smaller α.
[Figures: J(θ) vs. No. of iterations for diverging and oscillating runs]
For sufficiently small α, J(θ) should decrease on every iteration.
But if α is too small, gradient descent can be slow to converge.
17
Summary:
If α is too small: slow convergence.
If α is too large: J(θ) may not decrease on every iteration; it may not converge at all.
To choose α, try a range of values spaced by roughly a factor of three, e.g. ..., 0.001, 0.003, 0.01, 0.03, 0.1, 0.3, 1, ...
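A sketch of that scan, reusing the gradient_descent function from the earlier sketch; the candidate list and iteration count are illustrative choices:

    % Run a short burst of gradient descent for each candidate alpha and
    % report the resulting cost; pick the largest alpha for which
    % J(theta) still decreases steadily.
    alphas = [0.001 0.003 0.01 0.03 0.1 0.3 1];
    m = length(y);
    for k = 1:length(alphas)
      theta = gradient_descent(X, y, zeros(columns(X), 1), alphas(k), 50);
      J = (1 / (2 * m)) * sum((X * theta - y) .^ 2);
      printf('alpha = %5.3f  ->  J = %.4f\n', alphas(k), J);
    end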
18
Features and polynomial regression
19
Housing prices prediction
E.g. hθ(x) = θ₀ + θ₁ × frontage + θ₂ × depth. We need not use the features we are given: we can instead define a new feature x = frontage × depth (the land area) and fit hθ(x) = θ₀ + θ₁x.
20
Polynomial regression
[Figure: Price (y) vs. Size (x)]
E.g. fit a cubic model hθ(x) = θ₀ + θ₁x + θ₂x² + θ₃x³ by defining x₁ = (size), x₂ = (size)², x₃ = (size)³. With features like these, feature scaling becomes essential: sizes run up to ~10³, their squares ~10⁶, their cubes ~10⁹.
21
Choice of features
[Figure: Price (y) vs. Size (x)]
Instead of a cubic, one could choose hθ(x) = θ₀ + θ₁(size) + θ₂√(size), which keeps increasing but flattens out rather than eventually turning downward.
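A sketch of building those feature columns in Octave before fitting; house_size is an illustrative m × 1 vector of sizes, and as noted above the resulting columns need feature scaling:

    % Sizes taken from the earlier table, for illustration.
    house_size = [2104; 1416; 1534; 852];
    % Cubic-model features: x1 = size, x2 = size^2, x3 = size^3.
    X_poly = [house_size, house_size .^ 2, house_size .^ 3];
    % Alternative feature choice: x1 = size, x2 = sqrt(size).
    X_sqrt = [house_size, sqrt(house_size)];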
22
Normal equation
23
Gradient descent vs. the normal equation: the normal equation is a method to solve for θ analytically, in one step, instead of iterating.
24
Intuition: in 1D (θ ∈ ℝ), J(θ) is a quadratic; set dJ(θ)/dθ = 0 and solve for θ.
In general (θ ∈ ℝⁿ⁺¹): set ∂J(θ)/∂θⱼ = 0 for every j, and solve for θ₀, θ₁, ..., θₙ.
25
Examples: m = 4.

x₀  Size (feet²)  Number of bedrooms  Number of floors  Age of home (years)  Price ($1000)
1   2104          5                   1                 45                   460
1   1416          3                   2                 40                   232
1   1534          3                   2                 30                   315
1   852           2                   1                 36                   178

X = the m × (n+1) matrix of feature columns (including the added x₀ = 1 column); y = the m-vector of prices.
θ = (XᵀX)⁻¹Xᵀy

Pop-up Quiz
26
m examples (x^(1), y^(1)), ..., (x^(m), y^(m)); n features.
x^(i) = [x₀^(i); x₁^(i); ...; xₙ^(i)] ∈ ℝⁿ⁺¹
Design matrix X (m × (n+1)): row i is (x^(i))ᵀ. E.g. if x^(i) = [1; x₁^(i)], then
X = [1 x₁^(1); 1 x₁^(2); ...; 1 x₁^(m)] and y = [y^(1); y^(2); ...; y^(m)].
θ = (XᵀX)⁻¹Xᵀy
27
(XᵀX)⁻¹ is the inverse of the matrix XᵀX.
Octave: pinv(X'*X)*X'*y
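Putting the pieces together, a self-contained Octave sketch of the normal equation on the table data from earlier (the variable names are illustrative):

    % Feature columns (size, bedrooms, floors, age) and target prices.
    X_raw = [2104 5 1 45; 1416 3 2 40; 1534 3 2 30; 852 2 1 36];
    y = [460; 232; 315; 178];
    m = rows(X_raw);
    X = [ones(m, 1), X_raw];        % prepend the x0 = 1 column
    theta = pinv(X' * X) * X' * y;  % solve for theta in one step

Unlike gradient descent, no feature scaling is needed here.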
28
m training examples, n features.

Gradient Descent                     Normal Equation
Need to choose α.                    No need to choose α.
Needs many iterations.               No need to iterate.
Works well even when n is large.     Need to compute (XᵀX)⁻¹ (roughly O(n³)); slow if n is very large.
29
Normal equation and non-invertibility (optional)
30
Normal equation
What if XᵀX is non-invertible (singular/degenerate)?
Octave: pinv(X'*X)*X'*y. The pinv (pseudo-inverse) function still computes a valid θ even when XᵀX is non-invertible, whereas inv would not.
31
What if XᵀX is non-invertible?
Redundant features (linearly dependent). E.g. x₁ = size in feet², x₂ = size in m²; then x₁ = (3.28)² · x₂, so the columns are linearly dependent.
Too many features (e.g. m ≤ n). Fix: delete some features, or use regularization.
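A small Octave sketch of the redundant-feature case: duplicating the size column in different units makes XᵀX singular, yet pinv still returns a usable θ (the data are the illustrative values from the earlier table):

    size_ft2 = [2104; 1416; 1534; 852];
    size_m2  = size_ft2 / 3.28^2;          % linearly dependent on size_ft2
    X = [ones(4, 1), size_ft2, size_m2];   % X' * X is singular
    y = [460; 232; 315; 178];
    theta = pinv(X' * X) * X' * y;         % pseudo-inverse still yields a theta
    % inv(X' * X) would instead warn that the matrix is singular and
    % produce unusable values.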