I have this code in Python:
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import ListedColormap

def plot_decision_regions(X, y, classifier, resolution=0.02):
    # marker and color setup for up to five classes
    markers = ('s', 'x', 'o', '^', 'v')
    colors = ('red', 'blue', 'lightgreen', 'gray', 'cyan')
    cmap = ListedColormap(colors[:len(np.unique(y))])
    # build a grid covering the feature space
    x1_min, x1_max = X[:, 0].min() - 1, X[:, 0].max() + 1
    x2_min, x2_max = X[:, 1].min() - 1, X[:, 1].max() + 1
    xx1, xx2 = np.meshgrid(np.arange(x1_min, x1_max, resolution),
                           np.arange(x2_min, x2_max, resolution))
    # predict the class of every grid point and draw the regions
    Z = classifier.predict(np.array([xx1.ravel(), xx2.ravel()]).T)
    Z = Z.reshape(xx1.shape)
    plt.contourf(xx1, xx2, Z, alpha=0.3, cmap=cmap)
    plt.xlim(xx1.min(), xx1.max())
    plt.ylim(xx2.min(), xx2.max())
    # plot the training samples of each class
    for idx, cl in enumerate(np.unique(y)):
        plt.scatter(x=X[y == cl, 0], y=X[y == cl, 1],
                    alpha=0.8, c=colors[idx], marker=markers[idx],
                    label=cl, edgecolor='black')
Here X is a 100x2 array with the data (sepal and petal length for two kinds of flowers), y is a length-100 vector containing only -1 and 1 values (the class-label vector), and classifier is a Perceptron instance. I don't understand why I need to take the transpose in

Z = classifier.predict(np.array([xx1.ravel(), xx2.ravel()]).T)
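
To show what confuses me, here is a minimal shape check I ran on my own (the toy grid below is made up, not the real meshgrid from the function):

import numpy as np

# toy 3x4 grid, same idea as the meshgrid inside plot_decision_regions
xx1, xx2 = np.meshgrid(np.arange(0, 4), np.arange(0, 3))
print(xx1.shape)                                     # (3, 4)
print(np.array([xx1.ravel(), xx2.ravel()]).shape)    # (2, 12)  -> one row per feature
print(np.array([xx1.ravel(), xx2.ravel()]).T.shape)  # (12, 2)  -> one row per grid point, same layout as X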
What do

classifier.predict

and

x=X[y == cl, 0], y=X[y == cl, 1]

in

plt.scatter(x=X[y == cl, 0], y=X[y == cl, 1], alpha=0.8, c=colors[idx], marker=markers[idx], label=cl, edgecolor='black')

do?
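
I also tried a tiny boolean-indexing example on made-up data (not the iris data) to see what that selection does, but I am still not sure I read it correctly:

import numpy as np

# made-up data: 4 samples, 2 features, labels -1 and 1
X_toy = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0], [7.0, 8.0]])
y_toy = np.array([-1, 1, -1, 1])
print(y_toy == -1)            # [ True False  True False]
print(X_toy[y_toy == -1, 0])  # [1. 5.]  -> column 0 of the rows where y_toy == -1
print(X_toy[y_toy == -1, 1])  # [2. 6.]  -> column 1 of those rows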
Earlier in the script I load a DataFrame, define my predict method, and define X and y:
def predict(self, X):
    """Return class label after unit step"""
    return np.where(self.net_input(X) >= 0.0, 1, -1)
My Perceptron class also contains the weights w_, which are adjusted while iterating. Sorry if my English is not perfect.
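
(For reference, net_input in my class is the usual weighted sum of inputs and weights; roughly like this, where w_[0] is the bias unit:)

def net_input(self, X):
    """Calculate net input: dot product of X and the weights, plus the bias"""
    return np.dot(X, self.w_[1:]) + self.w_[0]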
y = df.iloc[0:100, 4].values
y = np.where(y == 'Iris-setosa', -1, 1)
X = df.iloc[0:100, [0, 2]].values
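
A quick sanity check of what X and y contain after this (output shown as comments):

print(X.shape)       # (100, 2) -> 100 samples, 2 features (sepal length, petal length)
print(y.shape)       # (100,)   -> one label per sample
print(np.unique(y))  # [-1  1]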

To clarify: X and y in the function call contain the data and the class labels respectively, and colors and markers are the tuples defined at the top of plot_decision_regions.