Introduction
Learning the x → y relationship
Case study: predicting house prices
Module 1: Simple Regression
Gradient descent algorithm
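A minimal numpy sketch of gradient descent for simple (one-input) regression, as I understand it; this is my own illustration rather than the course's code, and the function name, step size, and toy data are assumptions:

import numpy as np

def simple_regression_gd(x, y, step_size=1e-3, tolerance=1e-6, max_iter=10000):
    # Fit y ~ w0 + w1*x by descending the gradient of RSS = sum((pred - y)^2).
    w0, w1 = 0.0, 0.0
    for _ in range(max_iter):
        error = (w0 + w1 * x) - y
        grad_w0 = 2 * error.sum()          # dRSS/dw0
        grad_w1 = 2 * (error * x).sum()    # dRSS/dw1
        w0 -= step_size * grad_w0
        w1 -= step_size * grad_w1
        if np.hypot(grad_w0, grad_w1) < tolerance:   # stop when the gradient is tiny
            break
    return w0, w1

# Toy check: noisy data around y = 3 + 2x should give weights near (3, 2).
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 100)
y = 3.0 + 2.0 * x + rng.normal(0, 0.1, 100)
print(simple_regression_gd(x, y))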
Module 2: Multiple Regression
Incorporate more inputs
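For the multiple-input case, a quick sketch of fitting the weights with numpy's least-squares solver; the feature values below are made up for illustration, not course data:

import numpy as np

# Each row is one house: a constant-1 column for the intercept, sqft, #bedrooms.
# (Illustrative numbers only.)
X = np.array([[1.0, 1180, 3],
              [1.0, 2570, 4],
              [1.0,  770, 2],
              [1.0, 1960, 4]])
y = np.array([220000.0, 540000.0, 180000.0, 600000.0])   # observed prices

w, residuals, rank, sv = np.linalg.lstsq(X, y, rcond=None)   # least-squares weights
print("weights:", w)
print("predictions:", X @ w)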
Module 3: Assessing Performance
Overfitting
Measures of error (training, test, true)
Bias-variance tradeoff
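A small synthetic sketch of why training error alone is misleading: higher-degree fits drive training RSS down, while test RSS (a stand-in for the true error) typically goes back up. The sine data and the degrees compared are my own choices, not from the course:

import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 30)
y = np.sin(np.pi * x) + rng.normal(0, 0.2, 30)

# Hold out part of the data: training error is measured on the fit data,
# test error approximates the true (generalization) error.
x_train, y_train = x[:20], y[:20]
x_test, y_test = x[20:], y[20:]

for degree in (1, 3, 10):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_rss = np.sum((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_rss = np.sum((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train RSS = {train_rss:.3f}, test RSS = {test_rss:.3f}")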
Module 4: Ridge Regression
In addition to measuring fit, ask how to choose the balance between fit and model complexity (ridge total cost = measure of fit + measure of model complexity)
Cross validation
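A from-scratch sketch of choosing the ridge penalty by k-fold cross validation. The closed-form solve, penalizing every weight, and the synthetic data are all simplifying assumptions on my part:

import numpy as np

def ridge_fit(X, y, l2_penalty):
    # Closed-form ridge solution: w = (X^T X + lambda*I)^(-1) X^T y.
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + l2_penalty * np.eye(d), X.T @ y)

def cross_val_rss(X, y, l2_penalty, k=5):
    # Average validation RSS over k folds for one choice of lambda.
    n = len(y)
    folds = np.array_split(np.arange(n), k)
    total = 0.0
    for fold in folds:
        mask = np.ones(n, dtype=bool)
        mask[fold] = False                        # train on everything except this fold
        w = ridge_fit(X[mask], y[mask], l2_penalty)
        total += np.sum((X[fold] @ w - y[fold]) ** 2)
    return total / k

# Pick the lambda with the lowest average validation error.
rng = np.random.default_rng(2)
X = rng.normal(size=(100, 5))
y = X @ np.array([1.0, 0.5, 0.0, -2.0, 3.0]) + rng.normal(0, 0.5, 100)
penalties = [0.0, 0.1, 1.0, 10.0, 100.0]
best = min(penalties, key=lambda lam: cross_val_rss(X, y, lam))
print("chosen l2_penalty:", best)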
Module 5: Feature Selection & Lasso Regression
Efficiency of predictions and interpretability
Lasso total cost = measure of fit + a (different) measure of model complexity, i.e. RSS(w) + λ·(sum over j of |w_j|), an L1 penalty in place of ridge's L2
Coordinate descent algorithm (very cool, maybe useful in tuning coil)
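A rough sketch of lasso coordinate descent on the cost RSS(w) + λ·(sum of |w_j|): cycle over the features and apply soft-thresholding to each weight. Penalizing every weight (including any intercept) and the synthetic data are my simplifications, not the course's exact setup:

import numpy as np

def lasso_coordinate_descent(X, y, l1_penalty, num_passes=100):
    # Minimize RSS(w) + l1_penalty * ||w||_1 by cyclic coordinate descent.
    n, d = X.shape
    w = np.zeros(d)
    z = np.sum(X ** 2, axis=0)               # per-feature squared norms
    for _ in range(num_passes):
        for j in range(d):
            # Residual with feature j's current contribution removed.
            r_j = y - X @ w + X[:, j] * w[j]
            rho_j = X[:, j] @ r_j
            # Soft-thresholding: shrink toward zero, set exactly to zero if small.
            if rho_j < -l1_penalty / 2:
                w[j] = (rho_j + l1_penalty / 2) / z[j]
            elif rho_j > l1_penalty / 2:
                w[j] = (rho_j - l1_penalty / 2) / z[j]
            else:
                w[j] = 0.0
    return w

# Sparse ground truth: only two features matter; lasso should zero out the rest.
rng = np.random.default_rng(3)
X = rng.normal(size=(200, 8))
y = 4.0 * X[:, 0] - 3.0 * X[:, 2] + rng.normal(0, 0.5, 200)
print(lasso_coordinate_descent(X, y, l1_penalty=50.0))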
Module 6: Nearest Neighbor & Kernel Regression
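A tiny 1-D sketch contrasting the two ideas: k-NN regression averages the targets of the k closest training points, while (Gaussian) kernel regression takes a weighted average of all targets. The function names, bandwidth, and data are assumptions for illustration:

import numpy as np

def knn_regression(x_train, y_train, x_query, k=5):
    # Average the targets of the k nearest training points (1-D inputs).
    distances = np.abs(x_train - x_query)
    nearest = np.argsort(distances)[:k]
    return y_train[nearest].mean()

def kernel_regression(x_train, y_train, x_query, bandwidth=0.2):
    # Nadaraya-Watson style prediction: Gaussian-weighted average of all targets.
    weights = np.exp(-0.5 * ((x_train - x_query) / bandwidth) ** 2)
    return np.sum(weights * y_train) / np.sum(weights)

rng = np.random.default_rng(4)
x = rng.uniform(0, 10, 200)
y = np.sin(x) + rng.normal(0, 0.1, 200)
print(knn_regression(x, y, x_query=3.0), kernel_regression(x, y, x_query=3.0))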
Covers models, algorithms, and concepts; a very important course
Assumed background: basic calculus (the concept of derivatives) and basic linear algebra (vectors, matrices, matrix multiplication; important to learn before the IBSC program starts); programming experience in Python
Relies on GraphLab Create and SFrames
Assignments: use pre-implemented algorithms first, then implement all the algorithms from scratch with the Numpy library
Net result: learn how to code the methods from scratch
A machine in the cloud is provided