Easy neural networks with scikit-learn


An easy-to-set-up classifier for basic datasets like MNIST or dogs-vs-cats; you should get about 85% accuracy after one round of training. (Strictly speaking, the classifier below is Multinomial Naive Bayes rather than a fully connected network, but it makes a quick baseline before moving to a real FCN.) This is intended to run in Jupyter/Colab notebooks, but everything works from the CLI except the inline plots. Anaconda3 on Windows works just fine too. If you want to offload to a GPU, TensorFlow and Keras are the way to go.

This is the MNIST prediction after one training run. A few classes are still kinda shaky, but overall accuracy is 84%.

[Screenshot: MNIST solution.PNG]

import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import fetch_openml
%matplotlib inline

# grab dataset (as_frame=False so X and y come back as numpy arrays,
# which the fancy indexing below relies on; newer scikit-learn
# returns a pandas DataFrame by default)
mnist = fetch_openml('mnist_784', as_frame=False)

# pull out features and labels
X = mnist['data']
y = mnist['target']

# uncomment to inspect types and shapes
# print(X.dtype, y.dtype)
# print(X.shape, y.shape)

# plot MNIST digits with their labels inline (Jupyter)
def plot_images(images, labels):
    n_cols = min(5, len(images))
    n_rows = len(images) // n_cols
    fig = plt.figure(figsize=(8, 8))

    for i in range(n_rows * n_cols):
        sp = fig.add_subplot(n_rows, n_cols, i + 1)
        sp.set_title(labels[i])
        sp.axis('off')
        plt.imshow(images[i], cmap=plt.cm.gray)

p = np.random.permutation(len(X))
p = p[:20]
plot_images(X[p].reshape(-1, 28, 28), y[p])

# cast labels to ints and scale pixels to [0, 1]
y = y.astype("int32")
X = X / 255.0

# test normalization
X.min(), X.max()

# split into train and test sets (random_state for reproducibility)
from sklearn.model_selection import train_test_split
train_X, test_X, train_y, test_y = train_test_split(X, y, random_state=0)

# test shape
train_X.shape, test_X.shape

# fit the classifier (Multinomial Naive Bayes: fast, no GPU needed)
from sklearn.naive_bayes import MultinomialNB
cls = MultinomialNB()
cls.fit(train_X, train_y)

# eval
cls.score(test_X, test_y)

# per-class precision/recall for the predictions
from sklearn.metrics import classification_report
predictions = cls.predict(test_X)
print(classification_report(test_y, predictions))
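If you want to see which digits get mixed up with which, a confusion matrix is more informative than the flat report. Here's a minimal self-contained sketch using scikit-learn's small built-in load_digits set (8x8 images) so it runs without the full MNIST download; the variable names are mine, not from the walkthrough above:

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import MultinomialNB
from sklearn.metrics import confusion_matrix

digits = load_digits()
Xd, yd = digits.data, digits.target

tr_X, te_X, tr_y, te_y = train_test_split(Xd, yd, random_state=0)

clf = MultinomialNB()
clf.fit(tr_X, tr_y)

# rows = true digit, columns = predicted digit;
# off-diagonal entries show which digits get confused
cm = confusion_matrix(te_y, clf.predict(te_X))
print(cm)
```

Each row sums to the number of test samples with that true label, so big off-diagonal counts point straight at the confusable pairs.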

# plot final predictions
p = np.random.permutation(len(test_X))
p = p[:20]
plot_images(test_X[p].reshape(-1, 28, 28), predictions[p])
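If you want an actual fully connected network in scikit-learn, rather than the Naive Bayes baseline used above, MLPClassifier is the drop-in choice. A sketch on the small built-in load_digits set so it trains in seconds; the layer size and iteration count are my guesses, so tune them for full MNIST:

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

digits = load_digits()
X, y = digits.data / 16.0, digits.target   # pixels run 0-16 here, so scale to [0, 1]

train_X, test_X, train_y, test_y = train_test_split(X, y, random_state=0)

# one hidden layer of 64 neurons; max_iter bumped so the solver converges
mlp = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
mlp.fit(train_X, train_y)
print(mlp.score(test_X, test_y))
```

On full MNIST this should land well above what Naive Bayes manages; for GPU training you'd still move to TensorFlow/Keras as noted at the top.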