
Coursera Lecture Notes / Machine Learning - Andrew Ng (3)
Week 3: Logistic Regression (feat. Regularization) Keywords and a brief summary of Week 3 of Professor Andrew Ng's Machine Learning course. Logistic Regression: Hypothesis, Cost Function, Gradient. Hypothesis: $h_\theta(x) = \frac{1}{1 + e^{-\theta^T x}}$. Cost function (cross-entropy): $J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y \log(h_\theta(x)) + (1 - y) \log(1 - h_\theta(x)) \right]$. Gradient descent: $\theta_j := \theta_j - \alpha \frac{\partial}{\partial \theta_j} J(\theta) = \theta_j - \alpha \frac{1}{m} \sum_{i=1}^{m} (h_\theta(x) - y) x_j$..
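The three pieces above can be sketched in NumPy. This is a minimal illustration, not code from the course; the function and variable names are my own.

```python
import numpy as np

def sigmoid(z):
    # Hypothesis: h_theta(x) = 1 / (1 + e^(-theta^T x))
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    # Cross-entropy cost J(theta); X carries a leading column of ones for theta_0
    m = len(y)
    h = sigmoid(X @ theta)
    return -(1.0 / m) * np.sum(y * np.log(h) + (1 - y) * np.log(1 - h))

def gradient_step(theta, X, y, alpha):
    # One update: theta_j := theta_j - alpha * (1/m) * sum((h - y) * x_j), vectorized over j
    m = len(y)
    h = sigmoid(X @ theta)
    return theta - alpha * (1.0 / m) * (X.T @ (h - y))

# Tiny usage example on a linearly separable toy set
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])  # first column: bias term
y = np.array([0.0, 0.0, 1.0, 1.0])
theta = np.zeros(2)
for _ in range(5000):
    theta = gradient_step(theta, X, y, alpha=0.5)
```

After enough iterations the cost drops and the sigmoid outputs, thresholded at 0.5, match the labels.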
Week 2 Key keywords and content from Professor Andrew Ng's Machine Learning course. Linear Regression with Multiple Variables: Hypothesis, Cost Function, and Gradient. Hypothesis: $h_\theta(x) = \theta^T x = \theta_0 + \theta_1 x_1 + \theta_2 x_2 + ... + \theta_n x_n$. Cost function: $J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} (h_\theta(x) - y)^2$. Gradient descent: $\theta_j := \theta_j - \alpha \frac{\partial}{\partial \theta_j} J(\theta) = \theta_j - \alpha \frac{1}{m} \sum_{i=1}^{m} (h_\theta(x) - y) x_j$..
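The multivariate hypothesis, cost, and gradient-descent update above can be sketched as follows. This is a toy illustration under my own naming, not the course's Octave assignments.

```python
import numpy as np

def predict(theta, X):
    # Hypothesis: h_theta(x) = theta^T x (X carries a leading column of ones for theta_0)
    return X @ theta

def cost(theta, X, y):
    # J(theta) = (1 / 2m) * sum((h_theta(x) - y)^2)
    m = len(y)
    return (1.0 / (2 * m)) * np.sum((predict(theta, X) - y) ** 2)

def gradient_descent(X, y, alpha=0.1, iters=2000):
    # theta_j := theta_j - alpha * (1/m) * sum((h_theta(x) - y) * x_j), all j at once
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        theta = theta - alpha * (1.0 / m) * (X.T @ (predict(theta, X) - y))
    return theta

# Toy data generated exactly from y = 1 + 2*x1 + 3*x2, so the true theta is [1, 2, 3]
X = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 1.0],
              [1.0, 2.0, 1.0]])
y = X @ np.array([1.0, 2.0, 3.0])
theta = gradient_descent(X, y)
```

With noise-free data and a small enough learning rate, gradient descent recovers the generating parameters.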
Week 1 Key keywords and content from Professor Andrew Ng's Machine Learning course. Introduction. Definition of Machine Learning: learning without being explicitly programmed; for task T, as experience E grows, performance P improves. Categorization of Machine Learning: Supervised Learning and Unsupervised Learning; Regression and Classification; the Cocktail Party Problem. Linear Regression with One Variable: Univariate Linear Regr..