Lecture 28: Support Vector Machines (SVMs)

1. Introduction

Support Vector Machines (SVMs) are supervised learning models used for classification and regression. They work by finding the optimal hyperplane that separates data points of different classes with the maximum margin.
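The separating hyperplane idea can be sketched in a few lines of plain Python. This is only an illustration: the weights w and bias b below are hand-chosen (in practice they are learned from data), not the result of running an SVM solver.

```python
# Hand-chosen hyperplane for illustration: w . x + b = 0
# Here the boundary is the line x1 + x2 = 3.
w = [1.0, 1.0]
b = -3.0

def decision(x):
    """Signed value of w . x + b; its sign gives the predicted class."""
    return sum(wi * xi for wi, xi in zip(w, x)) + b

def classify(x):
    """+1 if the point lies on the positive side of the hyperplane, else -1."""
    return 1 if decision(x) >= 0 else -1

# Points on opposite sides of the line x1 + x2 = 3:
print(classify([2.0, 2.0]))  # -> 1   (2 + 2 - 3 = 1 > 0)
print(classify([0.5, 0.5]))  # -> -1  (0.5 + 0.5 - 3 = -2 < 0)
```

Training an SVM amounts to choosing w and b so that this boundary separates the classes with the widest possible margin.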

2. Key Concepts

  1. Hyperplane – The decision boundary that separates the classes; a line in 2D, a plane in 3D, and a hyperplane in higher dimensions.
  2. Margin – The distance between the hyperplane and the nearest data points of each class; the SVM chooses the hyperplane that maximizes it.
  3. Support Vectors – The training points that lie closest to the hyperplane; they alone determine its position.
  4. Kernel Trick – Implicitly maps the data into a higher-dimensional space so that data which is not linearly separable in the original space becomes separable.
  5. Soft Margin (C parameter) – Allows some misclassified points, trading margin width against training error.

3. Types of Kernels

  1. Linear Kernel – The dot product of the inputs; suitable when the data is (roughly) linearly separable.
  2. Polynomial Kernel – Captures polynomial interactions between features, with the degree as a tunable parameter.
  3. RBF (Radial Basis Function) Kernel – Handles complex, non-linear boundaries and is the most common default choice.
  4. Sigmoid Kernel – Uses a tanh of the dot product, resembling the activation of a two-layer neural network.
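Each kernel is just a function of two feature vectors. The minimal definitions below use example parameter values (degree, c, gamma, alpha) chosen for illustration; in practice these are tuned.

```python
import math

def linear(x, y):
    """Linear kernel: plain dot product."""
    return sum(a * b for a, b in zip(x, y))

def polynomial(x, y, degree=2, c=1.0):
    """Polynomial kernel: (x . y + c)^degree."""
    return (linear(x, y) + c) ** degree

def rbf(x, y, gamma=0.5):
    """RBF kernel: exp(-gamma * ||x - y||^2)."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)

def sigmoid(x, y, alpha=0.1, c=0.0):
    """Sigmoid kernel: tanh(alpha * x . y + c)."""
    return math.tanh(alpha * linear(x, y) + c)

x, y = [1.0, 2.0], [2.0, 1.0]
print(linear(x, y))      # -> 4.0
print(polynomial(x, y))  # -> 25.0  ((4 + 1)^2)
print(rbf(x, x))         # -> 1.0   (identical points)
```

A kernel value can be read as a similarity score: identical points score highest (the RBF kernel gives exactly 1.0), and the SVM builds its decision boundary from these pairwise similarities rather than from explicit high-dimensional coordinates.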

4. Examples

Example 1: Spam vs. Non-Spam Email Classification using word frequency as features.

Example 2: Cancer detection (Malignant vs. Benign) based on cell size and shape.

Example 3: Handwritten digit recognition (0–9) using image pixel features.

5. Applications

  1. Text classification – Spam filtering, sentiment analysis, document categorization.
  2. Medical diagnosis – Classifying tumors or cells as malignant vs. benign.
  3. Image recognition – Handwritten digit and face recognition from pixel features.
  4. Bioinformatics – Protein classification and gene expression analysis.

6. Advantages & Disadvantages

Advantages:

  1. Effective in high-dimensional spaces, even when the number of features exceeds the number of samples.
  2. Memory-efficient – The decision boundary depends only on the support vectors.
  3. Versatile – Different kernels handle linear and non-linear problems.

Disadvantages:

  1. Training is slow on very large datasets.
  2. Performance is sensitive to the choice of kernel and its parameters.
  3. Does not directly provide probability estimates for its predictions.

7. Interactive Calculator

SVM Margin Calculator (2D Case)

Equation of the hyperplane: w1*x1 + w2*x2 + b = 0

Margin width: 2 / sqrt(w1^2 + w2^2)

Perpendicular distance of a point (x1, x2) from the hyperplane: |w1*x1 + w2*x2 + b| / sqrt(w1^2 + w2^2)
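The 2D margin calculation can be sketched directly from the hyperplane equation w1*x1 + w2*x2 + b = 0; the example hyperplane 3*x1 + 4*x2 - 5 = 0 is chosen so that ||w|| = 5 makes the arithmetic easy to check by hand.

```python
import math

def margin_width(w1, w2):
    """Margin width 2 / ||w|| for the hyperplane w1*x1 + w2*x2 + b = 0."""
    return 2.0 / math.hypot(w1, w2)

def point_distance(w1, w2, b, x1, x2):
    """Perpendicular distance of the point (x1, x2) from the hyperplane."""
    return abs(w1 * x1 + w2 * x2 + b) / math.hypot(w1, w2)

# Example hyperplane: 3*x1 + 4*x2 - 5 = 0, so ||w|| = sqrt(9 + 16) = 5
print(margin_width(3.0, 4.0))                     # -> 0.4
print(point_distance(3.0, 4.0, -5.0, 2.0, 1.0))   # -> 1.0
```

For a trained SVM the margin width 2/||w|| is exactly the quantity the optimizer maximizes, which is why training is usually phrased as minimizing ||w|| subject to the classification constraints.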