```python
from sklearn.tree import DecisionTreeClassifier

# min_samples_split=40: don't split nodes with fewer than 40 samples
clf = DecisionTreeClassifier(min_samples_split=40)
clf.fit(features_train, labels_train)
```

## UD120 – Intro to Machine Learning

One item on my bucket list for 2018 was finishing the Udacity course UD120: Intro to Machine Learning.

The hosts of this course are Sebastian Thrun, ex-Google X and founder of Udacity, and Katie Malone, creator of the Linear Digressions podcast.

The course consists of 17 lessons. Every lesson has a couple of hours of video and lots and lots of quizzes in it.

- [x] Lesson 1: Only introduction 🙂
- [x] Lesson 2: Naive Bayes
- [x] Lesson 3: Support Vector Machines
- [x] Lesson 4: Decision Trees
- [x] Lesson 5: Choose your own algorithm
- [ ] Lesson 6: Datasets and questions
- [ ] Lesson 7: Regression
- [ ] Lesson 8: Outliers
- [ ] Lesson 9: Clustering
- [ ] Lesson 10: Feature Scaling
- [ ] Lesson 11: Text Learning
- [ ] Lesson 12: Feature Selection
- [ ] Lesson 13: PCA
- [ ] Lesson 14: Validation
- [ ] Lesson 15: Evaluation Metrics
- [ ] Lesson 16: Tying it all together
- [ ] Lesson 17: Final project

## Lesson 2: Naive Bayes

Lesson 2 of the Udacity course UD120 – Intro to Machine Learning deals with Naive Bayes classification.
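To give a flavor of what the lesson covers, here is a minimal sketch of Naive Bayes classification with scikit-learn, the library used throughout UD120. The toy data is made up for illustration and is not from the course.

```python
from sklearn.naive_bayes import GaussianNB

# Illustrative toy data: two well-separated clusters
features_train = [[1.0, 1.0], [1.2, 0.9], [5.0, 5.1], [4.8, 5.3]]
labels_train = [0, 0, 1, 1]

# Fit a Gaussian Naive Bayes classifier and predict two new points
clf = GaussianNB()
clf.fit(features_train, labels_train)
print(clf.predict([[1.1, 1.0], [5.0, 5.0]]))  # → [0 1]
```

Naive Bayes is fast to train and works surprisingly well on text problems, which is why the course uses it early on.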

## Lesson 3: Support Vector Machines

The term Support Vector Machine, or SVM, is a bit misleading. It is just the name of a very clever algorithm invented by two Russian mathematicians in the 1960s. SVMs are used for classification and regression.
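A quick sketch of an SVM classifier in scikit-learn, matching the style used in the course exercises. The toy data, kernel, and C value here are illustrative assumptions, not the course's exact parameters.

```python
from sklearn.svm import SVC

# Illustrative toy data: two linearly separable clusters
features_train = [[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]]
labels_train = [0, 0, 1, 1]

# Fit a linear-kernel SVM and classify two new points
clf = SVC(kernel="linear", C=1.0)
clf.fit(features_train, labels_train)
print(clf.predict([[0.1, 0.0], [1.0, 0.9]]))  # → [0 1]
```

Swapping `kernel="linear"` for `"rbf"` and tuning `C` and `gamma` lets the SVM fit non-linear decision boundaries, which is one of the main themes of the lesson.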