Week 3: Logistic Regression (feat. Regularization)

Keywords and a quick summary of Week 3 of Professor Andrew Ng's Machine Learning course.

Logistic Regression

Hypothesis, Cost Function, Gradient

  • Hypothesis
    $h_\theta(x) = \frac{1}{1+e^{-\theta^{T}x}}$
  • Cost Function: Cross-Entropy
    $J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\log(h_\theta(x^{(i)})) + (1-y^{(i)})\log(1-h_\theta(x^{(i)}))\right]$
  • Gradient
    $\theta_j := \theta_j - \alpha\frac{\partial}{\partial \theta_j}J(\theta)$
    $:= \theta_j - \alpha \frac{1}{m} \sum_{i=1}^{m} (h_\theta(x^{(i)}) - y^{(i)})\,x_j^{(i)}$
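A minimal NumPy sketch of the three pieces above, assuming a design matrix `X` of shape (m, n+1) with a leading column of ones, labels `y` in {0, 1}, and an illustrative learning rate `alpha`:

```python
import numpy as np

def sigmoid(z):
    # h_theta(x) = 1 / (1 + exp(-theta^T x))
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    # Cross-entropy cost J(theta)
    m = len(y)
    h = sigmoid(X @ theta)
    return -(1.0 / m) * (y @ np.log(h) + (1 - y) @ np.log(1 - h))

def gradient_step(theta, X, y, alpha):
    # Simultaneous update: theta_j := theta_j - alpha * (1/m) * sum_i (h(x_i) - y_i) * x_ij
    m = len(y)
    h = sigmoid(X @ theta)
    return theta - alpha * (1.0 / m) * (X.T @ (h - y))
```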

Decision Boundary
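The decision boundary is the set of points where $h_\theta(x) = 0.5$, i.e. where $\theta^{T}x = 0$: predict $y = 1$ when $\theta^{T}x \ge 0$, and $y = 0$ otherwise. For example, with $h_\theta(x) = g(\theta_0 + \theta_1 x_1 + \theta_2 x_2)$ and $\theta = [-3, 1, 1]^{T}$, the boundary is the line $x_1 + x_2 = 3$.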

Multi-Class Classification

  • One vs All Logistic Regression
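A sketch of one-vs-all, reusing `sigmoid` and `gradient_step` from the sketch above (the function names and the `alpha`/`iters` parameters are illustrative, not from the course):

```python
import numpy as np

def one_vs_all(X, y, num_classes, alpha, iters):
    # Train one binary logistic regression classifier per class k,
    # relabeling y as 1 where y == k and 0 everywhere else.
    all_theta = np.zeros((num_classes, X.shape[1]))
    for k in range(num_classes):
        theta = np.zeros(X.shape[1])
        y_k = (y == k).astype(float)
        for _ in range(iters):
            theta = gradient_step(theta, X, y_k, alpha)
        all_theta[k] = theta
    return all_theta

def predict_one_vs_all(all_theta, X):
    # Pick the class whose classifier reports the highest probability
    return np.argmax(sigmoid(X @ all_theta.T), axis=1)
```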

Regularization

Prevent Overfitting

  • Reduce the number of features (manually select which to keep, or use a model selection algorithm)
  • Regularization (keep all the features, but shrink the magnitudes of the parameters $\theta_j$)

Regularization in Cost Function

  • $J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\log(h_\theta(x^{(i)})) + (1-y^{(i)})\log(1-h_\theta(x^{(i)}))\right] + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^{2}$
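A sketch of the regularized cost and gradient step, reusing `sigmoid` from above; by convention the penalty skips the bias term $\theta_0$ (the `lam` parameter stands for $\lambda$):

```python
import numpy as np

def cost_reg(theta, X, y, lam):
    # Regularized cross-entropy: the penalty excludes theta_0
    m = len(y)
    h = sigmoid(X @ theta)
    penalty = (lam / (2.0 * m)) * np.sum(theta[1:] ** 2)
    return -(1.0 / m) * (y @ np.log(h) + (1 - y) @ np.log(1 - h)) + penalty

def gradient_step_reg(theta, X, y, alpha, lam):
    # theta_j := theta_j - alpha * [(1/m) sum_i (h - y) x_ij + (lam/m) theta_j] for j >= 1;
    # theta_0 is updated without the regularization term.
    m = len(y)
    h = sigmoid(X @ theta)
    grad = (1.0 / m) * (X.T @ (h - y))
    grad[1:] += (lam / m) * theta[1:]
    return theta - alpha * grad
```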
