Learn Before
Concept

Soft Margin Formulation

To relax the optimization problem for the hard-margin SVM, we allow data points to lie inside the margin of the decision boundary, but we penalize each violation with a slack variable $\xi_i$. We then reformulate the problem as follows:

$$\min_{\bar{\theta}, b, \xi_i} \frac{1}{2}\|\bar{\theta}\|^2 + C\sum_{i=1}^N \xi_i$$

$$\text{subject to } y^i (\bar{\theta} \cdot \bar{x}^i + b) \geq 1 - \xi_i,\quad \xi_i \geq 0 \ \forall i$$
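One way to see what the slack variables do is to evaluate them for a fixed candidate $(\bar{\theta}, b)$: the constraint forces $\xi_i \geq \max(0, 1 - y^i(\bar{\theta} \cdot \bar{x}^i + b))$, and at the optimum each $\xi_i$ equals that hinge value. Below is a minimal sketch in NumPy; the toy data points, the candidate weights, and `C = 1.0` are all hypothetical choices for illustration, not values from the text.

```python
import numpy as np

# Hypothetical toy data: labels in {-1, +1}
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [0.2, 0.1]])
y = np.array([1, 1, -1, -1])

theta = np.array([1.0, 1.0])  # candidate weight vector (assumed, not optimized)
b = 0.0                       # candidate bias
C = 1.0                       # penalty weight on the slack terms

# Slack: xi_i = max(0, 1 - y_i (theta . x_i + b)); zero when the point
# satisfies the margin, positive when it lies inside the margin or is
# misclassified.
margins = y * (X @ theta + b)
xi = np.maximum(0.0, 1.0 - margins)

# Soft-margin objective: (1/2)||theta||^2 + C * sum_i xi_i
objective = 0.5 * np.dot(theta, theta) + C * xi.sum()
print(xi, objective)
```

Here only the last point (a negative example sitting on the positive side of the boundary) picks up a nonzero slack; the other three satisfy the margin and contribute nothing to the penalty term.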

You might notice that this formulation is noticeably different from the one shown in the book. However, this reformulation will lead us down a very interesting rabbit hole.


Updated 2020-03-08

Tags

Data Science

Learn After