Machine Learning with a Multi-Layer Perceptron

With all the hype about machine learning and artificial intelligence, I thought it was a good time to refresh my knowledge of the subject. Machine learning meets a lot of scepticism because it mostly appears in the media for questionable purposes. Like all tools it can be used for harmful things, but it also has many beneficial applications. I'm mostly interested in using it for computer graphics, games and tools.
I have implemented a multi-layer perceptron in C++, using the Eigen template library for linear algebra. The code is quite simple: it lets you create a neural network with a variable number of layers and nodes per layer, train the network, query it, and reverse-query it.
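A minimal sketch of such a network, assuming a sigmoid activation and per-sample gradient descent (the class and method names are illustrative, not the original code):

```cpp
#include <Eigen/Dense>
#include <vector>

class Network {
public:
    // sizes gives the nodes per layer, e.g. {784, 100, 10}.
    Network(const std::vector<int>& sizes, double learningRate)
        : rate(learningRate) {
        for (std::size_t i = 1; i < sizes.size(); ++i)
            // Small random initial weights; Random() yields values in [-1, 1].
            weights.push_back(Eigen::MatrixXd::Random(sizes[i], sizes[i - 1]) * 0.1);
    }

    // Forward pass: propagate the input through all layers.
    Eigen::VectorXd query(const Eigen::VectorXd& input) const {
        Eigen::VectorXd a = input;
        for (const auto& w : weights)
            a = sigmoid(w * a);
        return a;
    }

    // One training step: forward pass, then backpropagation of the error.
    void train(const Eigen::VectorXd& input, const Eigen::VectorXd& target) {
        // Forward pass, keeping the activation of every layer.
        std::vector<Eigen::VectorXd> acts{input};
        for (const auto& w : weights)
            acts.push_back(sigmoid(w * acts.back()));

        // Backward pass: output error, then propagate through the layers.
        Eigen::VectorXd err = target - acts.back();
        for (int i = static_cast<int>(weights.size()) - 1; i >= 0; --i) {
            // Derivative of the sigmoid: a * (1 - a), element-wise.
            Eigen::VectorXd grad =
                (err.array() * acts[i + 1].array() * (1.0 - acts[i + 1].array())).matrix();
            Eigen::VectorXd prevErr = weights[i].transpose() * grad;
            weights[i] += rate * grad * acts[i].transpose();
            err = prevErr;
        }
    }

private:
    static Eigen::VectorXd sigmoid(const Eigen::VectorXd& x) {
        return (1.0 + (-x.array()).exp()).inverse().matrix();
    }

    std::vector<Eigen::MatrixXd> weights;
    double rate;
};
```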

I started with a standard test: recognition of handwritten numbers using the MNIST handwritten digit database. The dataset contains images of handwritten digits at a resolution of 28×28 pixels. After having some problems with the network weights (their initial values were too large), I got it running and started some tests. The network I used had three layers: the input layer with one node per pixel, a hidden layer with just one hundred nodes, and the output layer with one node for each digit. I first trained the neural network on all training images multiple times and queried it afterwards.
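A hypothetical setup for this experiment, using the Network class sketched above (the MNIST loader is omitted, and the epoch count and learning rate are illustrative values, not necessarily the exact settings). The small scale factor on the random initial weights in the sketch above is exactly the kind of fix the weight problem needs; scaling by 1/√(fan-in) is a common alternative.

```cpp
int main() {
    std::vector<Eigen::VectorXd> images; // 784 pixel values per image, scaled to (0, 1)
    std::vector<int> labels;             // the digit shown in each image
    // ... load the MNIST training set here (loader omitted) ...

    // 784 input nodes (28x28 pixels), 100 hidden nodes, 10 output nodes.
    Network net({784, 100, 10}, 0.1);

    for (int epoch = 0; epoch < 5; ++epoch) {
        for (std::size_t i = 0; i < images.size(); ++i) {
            // One-hot target, kept away from the sigmoid's asymptotes 0 and 1.
            Eigen::VectorXd target = Eigen::VectorXd::Constant(10, 0.01);
            target(labels[i]) = 0.99;
            net.train(images[i], target);
        }
    }
}
```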
Of the 10,000 test images, 97.05% were classified correctly. Given how simple the neural network is, this is pretty good in my opinion. However, for a real-world application the error is still too large.
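The classification itself is straightforward: the predicted digit is the output node with the highest activation. A sketch of the evaluation (helper name and containers are illustrative):

```cpp
// Fraction of test images whose most active output node matches the label.
double accuracy(const Network& net,
                const std::vector<Eigen::VectorXd>& testImages,
                const std::vector<int>& testLabels) {
    int correct = 0;
    for (std::size_t i = 0; i < testImages.size(); ++i) {
        Eigen::Index predicted;
        net.query(testImages[i]).maxCoeff(&predicted);
        if (static_cast<int>(predicted) == testLabels[i])
            ++correct;
    }
    return static_cast<double>(correct) / testImages.size();
}
```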
After that I made a reverse query to see what input image the neural network considers a perfect match for each digit. Here are some of the resulting images (enlarged):

[Images: reverse-query results for the digits 0 through 9]
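The reverse query runs the network backwards: starting from the desired output, each layer applies the inverse of the sigmoid (the logit) and propagates the signal through the transposed weight matrix, rescaling in between so the logit stays defined. A sketch of this, assuming the layer weight matrices are accessible (function and parameter names are illustrative):

```cpp
#include <Eigen/Dense>
#include <vector>

Eigen::VectorXd reverseQuery(const std::vector<Eigen::MatrixXd>& weights,
                             const Eigen::VectorXd& output) {
    Eigen::VectorXd signal = output;
    for (auto it = weights.rbegin(); it != weights.rend(); ++it) {
        // Inverse sigmoid: logit(y) = ln(y / (1 - y)), element-wise.
        Eigen::ArrayXd y = signal.array();
        Eigen::VectorXd x = (y / (1.0 - y)).log().matrix();
        // Propagate backwards through the transposed weights.
        signal = it->transpose() * x;
        // Rescale into (0.01, 0.99) so the next logit stays defined.
        double lo = signal.minCoeff(), hi = signal.maxCoeff();
        signal = (((signal.array() - lo) / (hi - lo)) * 0.98 + 0.01).matrix();
    }
    return signal; // one value per input pixel, viewable as a 28x28 image
}
```

Feeding in a target vector of 0.01s with a 0.99 at the desired digit then yields the 784 pixel values that can be rendered as the images above.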

As a next step I plan to implement a convolutional neural network, but I’m also still looking for cool applications for it.