python - Plotting a decision boundary separating 2 classes using Matplotlib's pyplot


I could really use a tip to help me plot a decision boundary that separates two classes of data. I created some sample data (from a Gaussian distribution) via Python NumPy. In this case, every data point is a 2D coordinate, i.e., a 1-column vector consisting of 2 rows. E.g.,

[ 1   2 ] 

Let's assume I have 2 classes, class1 and class2, and I created 100 data points for class1 and 100 data points for class2 via the code below (assigned to the variables x1_samples and x2_samples).

mu_vec1 = np.array([0,0])
cov_mat1 = np.array([[2,0],[0,2]])
x1_samples = np.random.multivariate_normal(mu_vec1, cov_mat1, 100)
mu_vec1 = mu_vec1.reshape(1,2).T  # to 1-col vector

mu_vec2 = np.array([1,2])
cov_mat2 = np.array([[1,0],[0,1]])
x2_samples = np.random.multivariate_normal(mu_vec2, cov_mat2, 100)
mu_vec2 = mu_vec2.reshape(1,2).T  # to 1-col vector

When I plot the data points for each class, it looks like this:

[image: scatter plot of the two sample classes]

Now, I came up with an equation for a decision boundary to separate both classes and would like to add it to the plot. However, I am not sure how I can plot this function:

def decision_boundary(x_vec, mu_vec1, mu_vec2):
    g1 = (x_vec - mu_vec1).T.dot(x_vec - mu_vec1)
    g2 = 2 * ((x_vec - mu_vec2).T.dot(x_vec - mu_vec2))
    return g1 - g2
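One general way to plot an implicit function like this (a sketch, not code from the question) is to evaluate it on a grid of points and draw the zero-level contour with `plt.contour`, since the boundary is exactly the set of points where g1 equals g2. The grid limits here are an arbitrary choice:

```python
import numpy as np
from matplotlib import pyplot as plt

# The class means from the question, kept as flat arrays
mu_vec1 = np.array([0.0, 0.0])
mu_vec2 = np.array([1.0, 2.0])

def decision_boundary(x_vec, mu_vec1, mu_vec2):
    g1 = (x_vec - mu_vec1).T.dot(x_vec - mu_vec1)
    g2 = 2 * ((x_vec - mu_vec2).T.dot(x_vec - mu_vec2))
    return g1 - g2

# Evaluate the function on a grid; the level-0 contour of the
# resulting surface is the decision boundary g1 == g2.
xx, yy = np.meshgrid(np.linspace(-4, 6, 200), np.linspace(-4, 6, 200))
zz = np.array([decision_boundary(np.array([x, y]), mu_vec1, mu_vec2)
               for x, y in zip(xx.ravel(), yy.ravel())]).reshape(xx.shape)

plt.contour(xx, yy, zz, levels=[0], colors='red')
plt.show()
```

Vectorizing `decision_boundary` over the grid would be faster, but the point-by-point loop keeps the function exactly as written in the question.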

I would appreciate any help!

EDIT: Intuitively (if I did my math right), I would expect the decision boundary to look somewhat like this red line when I plot the function...

[image: expected red decision-boundary curve]

Your question is more complicated than a simple plot: you need to draw a contour that maximizes the inter-class distance. Fortunately it's a well-studied field, particularly for SVM machine learning.

The easiest method is to download the scikit-learn module, which provides a lot of cool methods to draw boundaries: http://scikit-learn.org/stable/modules/svm.html

The code:

# -*- coding: utf-8 -*-

import numpy as np
from matplotlib import pyplot as plt
from sklearn import svm

mu_vec1 = np.array([0,0])
cov_mat1 = np.array([[2,0],[0,2]])
x1_samples = np.random.multivariate_normal(mu_vec1, cov_mat1, 100)
mu_vec1 = mu_vec1.reshape(1,2).T  # to 1-col vector

mu_vec2 = np.array([1,2])
cov_mat2 = np.array([[1,0],[0,1]])
x2_samples = np.random.multivariate_normal(mu_vec2, cov_mat2, 100)
mu_vec2 = mu_vec2.reshape(1,2).T

fig = plt.figure()

plt.scatter(x1_samples[:,0], x1_samples[:,1], marker='+')
plt.scatter(x2_samples[:,0], x2_samples[:,1], c='green', marker='o')

x = np.concatenate((x1_samples, x2_samples), axis=0)
y = np.array([0]*100 + [1]*100)

C = 1.0  # SVM regularization parameter
clf = svm.SVC(kernel='linear', gamma=0.7, C=C)
clf.fit(x, y)

Linear plot (taken from http://scikit-learn.org/stable/auto_examples/svm/plot_svm_margin.html):


w = clf.coef_[0]
a = -w[0] / w[1]
xx = np.linspace(-5, 5)
yy = a * xx - (clf.intercept_[0]) / w[1]

plt.plot(xx, yy, 'k-')

[image: linear SVM decision boundary]

Non-linear plot (taken from http://scikit-learn.org/stable/auto_examples/svm/plot_iris.html):


C = 1.0  # SVM regularization parameter
clf = svm.SVC(kernel='rbf', gamma=0.7, C=C)
clf.fit(x, y)

h = .02  # step size in the mesh
# create a mesh to plot in
x_min, x_max = x[:, 0].min() - 1, x[:, 0].max() + 1
y_min, y_max = x[:, 1].min() - 1, x[:, 1].max() + 1
xx, yy = np.meshgrid(np.arange(x_min, x_max, h),
                     np.arange(y_min, y_max, h))

# Plot the decision boundary. For that, we assign a color to each
# point in the mesh [x_min, x_max] x [y_min, y_max].
Z = clf.predict(np.c_[xx.ravel(), yy.ravel()])

# Put the result into a color plot
Z = Z.reshape(xx.shape)
plt.contour(xx, yy, Z, cmap=plt.cm.Paired)

[image: RBF-kernel SVM decision regions]

Implementation

If you want to implement it yourself, you need to solve the following quadratic equation:

[image: boundary equation]

See the Wikipedia article.

Unfortunately, for non-linear boundaries like the one you drew, this is a difficult problem that relies on the kernel trick, and there isn't a clear-cut solution.
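For the toy data above, though, a closed form is within reach: g1 and g2 are both quadratic forms, so expanding g1 − g2 = 0 by hand with mu1 = [0, 0] and mu2 = [1, 2] gives −(x² + y²) + 4x + 8y − 10 = 0, i.e. a circle centered at (2, 4) with radius √10. A sketch of solving it per x-coordinate (`boundary_y` is a hypothetical helper name, not from the question):

```python
import numpy as np

mu1 = np.array([0.0, 0.0])
mu2 = np.array([1.0, 2.0])

# g1 = (x-mu1)^T (x-mu1),  g2 = 2 (x-mu2)^T (x-mu2)
# g1 - g2 = 0 expands to x^2 + y^2 - 4x - 8y + 10 = 0,
# a circle with center (2, 4) and radius sqrt(10).
def boundary_y(x):
    """Solve the quadratic y^2 - 8y + (x^2 - 4x + 10) = 0 for y."""
    disc = 64 - 4 * (x**2 - 4*x + 10)
    if disc < 0:
        return None  # this vertical line misses the boundary circle
    r = np.sqrt(disc)
    return ((8 - r) / 2, (8 + r) / 2)  # lower and upper intersection
```

Evaluating `boundary_y` over a range of x values and plotting both roots traces out the full boundary; any point it returns satisfies g1 = g2 exactly.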

