Lesson 3: Support Vector Machines

The term Support Vector Machine (SVM) is a bit misleading: it is just the name of a very clever algorithm, invented by the two Russian mathematicians Vladimir Vapnik and Alexey Chervonenkis in the 1960s. SVMs are used for classification and regression.

An SVM does this by finding the hyperplane between two classes of data that separates the classes best, i.e. the one with the largest margin to the nearest points of either class.

from time import time
from sklearn import svm
from sklearn.metrics import accuracy_score

# features_train, labels_train, features_test, labels_test
# come from the lesson's prepared dataset
print("Start training")
t0 = time()
clf = svm.SVC(kernel="linear")
clf.fit(features_train, labels_train)
print("training time:", round(time() - t0, 3), "s")

print("Start predicting")
t0 = time()
prediction = clf.predict(features_test)
print("predict time:", round(time() - t0, 3), "s")

# accuracy
print("Calculating accuracy")
accuracy = accuracy_score(labels_test, prediction)
print("Accuracy:", accuracy)

When timing the training of the SVC, it’s astonishing how long it takes: around 2.5 minutes at 98.4% accuracy.

As an alternative you can use:

clf = LinearSVC(loss='hinge')

This gets you a result in about 0.3 seconds with the same accuracy.
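To see the gap yourself, here is a minimal, self-contained sketch that times both classifiers. It uses synthetic data from make_classification as a stand-in for the lesson’s dataset (an assumption), so the absolute timings and accuracy will differ from the numbers above:

```python
from time import time

from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC, LinearSVC

# Synthetic stand-in for the lesson's dataset (an assumption for this sketch).
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

results = {}
for name, clf in [("SVC(kernel='linear')", SVC(kernel="linear")),
                  ("LinearSVC", LinearSVC(loss="hinge"))]:
    t0 = time()
    clf.fit(X_train, y_train)
    results[name] = accuracy_score(y_test, clf.predict(X_test))
    print(f"{name}: {time() - t0:.3f}s, accuracy {results[name]:.3f}")
```

On larger datasets the timing gap grows dramatically, while the two accuracies stay close as long as a linear decision boundary fits the data.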

What’s the difference? SVC with kernel="linear" is backed by libsvm, which solves the full kernelized problem and scales between quadratically and cubically with the number of samples. LinearSVC is backed by liblinear, which exploits the fact that the model is linear and scales to large datasets much better.

Parameter tuning

With the initial SVC we can play around with the parameters “C” and “kernel”:


C: the penalty parameter of the error term. A small C allows more margin violations and gives a smoother decision boundary; a large C punishes misclassified training points hard and fits the training data more tightly.
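A quick way to get a feel for C is to watch the number of support vectors: on noisy data, a small C keeps a wide margin that many points fall inside, while a large C tightens the fit. A small sketch on synthetic data (an illustrative assumption, not the lesson’s dataset):

```python
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Noisy two-class toy data (an assumption for this illustration).
X, y = make_classification(n_samples=300, n_features=2, n_informative=2,
                           n_redundant=0, n_clusters_per_class=1,
                           flip_y=0.1, random_state=0)

n_sv = {}
for C in (0.01, 1.0, 100.0):
    clf = SVC(kernel="linear", C=C).fit(X, y)
    # Points on or inside the margin become support vectors.
    n_sv[C] = int(clf.n_support_.sum())
    print(f"C={C}: {n_sv[C]} support vectors")
```

The count typically shrinks as C grows, because fewer training points are allowed to sit inside the margin.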

kernel: kernels and the kernel trick. A kernel implicitly maps the data into a higher-dimensional space where a linear separator may exist, without ever computing that mapping explicitly.
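The classic illustration of the kernel trick: two concentric circles are hopeless for a linear kernel but easy for an RBF kernel, which implicitly works in a space where the circles become separable. A small sketch on toy data:

```python
from sklearn.datasets import make_circles
from sklearn.metrics import accuracy_score
from sklearn.svm import SVC

# Two concentric circles: not linearly separable in the original 2D space.
X, y = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)

# Training accuracy is enough to show the point here.
acc_linear = accuracy_score(y, SVC(kernel="linear").fit(X, y).predict(X))
acc_rbf = accuracy_score(y, SVC(kernel="rbf").fit(X, y).predict(X))
print(f"linear kernel: {acc_linear:.2f}, rbf kernel: {acc_rbf:.2f}")
```

The linear kernel can do no better than roughly guessing, while the RBF kernel separates the rings almost perfectly.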



Further resources:

Ingo’s Deep Dive
SVM by Siraj Raval
Linear Digressions podcast on the kernel trick