Estimated time: 20 min

Deploy your number recognition model

How to solve a classification problem

Zero prior AI knowledge is required. We're simply going to feed the image pixels into a deep neural network. In our opinion, this is the simplest possible (and surprisingly powerful) starting point for deep learning.

Target group: Data scientist

Classification problems have always been around. Humans have at all times tried to classify the world around them. We try to make sense of the world by putting labels on things: this is a "car", this is a "truck", this is a "bus". But classification is hard, and it's not always easy to draw an exact dividing line between one class and another. Thanks to AI, this has now become much easier. With increased computing power, big data, and deep learning models, we can now solve hard classification problems much faster and more accurately.

Image classifier app

The MNIST classification problem is the “Hello world” of deep learning. We're going to take it one step further and show how you can make a real-world AI web service out of this little example dataset.

You will learn how to

  • Build an AI model quickly and easily
    Solve a classification problem. Create an experiment that predicts a label, in this case, what number an image depicts.
  • Use your AI model
    After training the model, you can deploy your experiment and start to use your AI application straight away.

The data

The original MNIST dataset consists of small, 28 x 28 pixel images of handwritten numbers that are annotated with a label indicating the correct number.

Examples from the MNIST dataset

The MNIST dataset we’re using in this tutorial consists of 3-channel RGB pictures, so that you can use the deployed experiment with photos taken on a phone.
Important note! For a model to be usable, the input data needs to be of the same type as the data the model was trained on. In this case, the picture that depicts a number needs to be in RGB (a standard digital image format). This is true for every AI model – it can’t predict apples when it has been trained on oranges.
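To make the channel-matching point concrete, here is a minimal sketch (using NumPy, which is not part of the platform itself) of how a single-channel grayscale digit image can be expanded into the 3-channel RGB format this model expects:

```python
import numpy as np

# A grayscale digit image: 28 x 28 pixel intensities, one channel.
gray = np.random.randint(0, 256, size=(28, 28), dtype=np.uint8)

# The model in this tutorial was trained on 3-channel RGB input, so a
# grayscale image must be expanded before it can be sent in. Repeating
# the single channel three times is the standard conversion.
rgb = np.stack([gray, gray, gray], axis=-1)

print(gray.shape)  # (28, 28)
print(rgb.shape)   # (28, 28, 3)
```

The image content is unchanged; only its shape now matches what the trained model expects.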

Create a project

First, create a project and name it so you know what kind of project it is. Naming is important.

A project combines all of the steps in solving a problem, from the pre-processing of datasets to model building, evaluation, and deployment. Collaborators can be invited to the project.

Add the MNIST dataset to the platform

After creating the project, you will be taken to the Datasets view, where you can import data.

Click the Data library button and look for the MNIST - tutorial data dataset in the list. Click on it to get more information.

If you agree with the license, click Accept and import. This will import the dataset into your project, and you will be taken to the dataset's details, where you can edit features and subsets.

Dataset features

The first samples of the MNIST dataset are shown in the Datasets view with one column for each feature: image and number. On top of each feature, there is a graph showing the distribution of the feature over its range.

The Number column has the Encoding Categorical. By using this encoding, you ensure that you are not imposing a specific hierarchy. This is very important. 1 will not be treated as closer to 2 than 3 – they are all different numbers, different categories, with unique properties.
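The usual way to realize categorical encoding is one-hot vectors, where each digit becomes its own independent class with no ordering between classes. A small sketch (using NumPy, as an illustration of the idea rather than the platform's internal implementation):

```python
import numpy as np

# As plain integers, the labels would impose an ordering (1 "closer" to 2
# than to 3). One-hot encoding represents each digit as its own class.
labels = np.array([3, 0, 7])
num_classes = 10

one_hot = np.zeros((len(labels), num_classes))
one_hot[np.arange(len(labels)), labels] = 1.0

print(one_hot[0])  # digit 3 -> [0. 0. 0. 1. 0. 0. 0. 0. 0. 0.]
```

Every vector has exactly one 1, and the distance between any two different classes is the same, so no hierarchy is implied.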

Subsets of the MNIST dataset

All samples in the dataset are, by default, split into 20% validation and 80% training subsets. Keep these default values in this project.
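Conceptually, the default split shuffles the samples and reserves a fifth of them for validation. A minimal sketch of that idea (the platform does this for you; this is only an illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 1000
indices = rng.permutation(n_samples)  # shuffle before splitting

# 80% training / 20% validation, matching the platform's default subsets.
split = int(0.8 * n_samples)
train_idx, val_idx = indices[:split], indices[split:]

print(len(train_idx), len(val_idx))  # 800 200
```

The validation subset is never used to update the model's weights; it only measures how well the model generalizes.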

Save the dataset

You’ve now created a dataset ready to be used in the platform. Click Save version.

Click Use in new experiment to navigate to the Modeling view where you'll find the Experiment wizard.

Design a straightforward deep learning network for the MNIST dataset

It is now time to create an experiment in the Modeling view. An experiment contains all the information needed to reproduce the experiment:

  • The dataset
  • The AI model
  • The settings or parameters used to run the experiment

The result from this experiment is a trained AI model that can be evaluated and deployed.

Everything preset in the Experiment wizard

In the Experiment wizard, you'll see that all parameters are preset. The MNIST dataset is selected in the Define dataset tab, and the CNN snippet is selected in the Choose snippet tab.
A CNN is a basic convolutional neural network, a great choice when getting started with the platform. The CNN looks for low-level features such as edges and curves and then builds up more abstract concepts through a series of convolutional layers. If you want to read more, see our article describing the CNN.

Click Create and the CNN will be added to the Modeling canvas.

CNN snippet

Run experiment

The experiment is set up and ready to be trained. All settings, e.g., batch size, learning rate, and target feature, have been pre-populated by the platform. The next step is simply to click Run. So let's do that!

Click Run in the top right corner to start the training.

Analyze experiment

Navigate to the Evaluation view and watch the model train. Let’s go through some graphs that show the model at work.

Loss graph

Model evaluation view — Training overview

Here you can see the loss decreasing with every epoch. The loss indicates the magnitude of error your model made in its predictions. It’s a way of evaluating how well your algorithm models your dataset. If your predictions are totally off, the loss function will output a higher number. If they’re pretty good, it’ll output a lower one. Is the loss low enough? Yes, this is good to go.
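For a classification problem like this one, the loss is typically categorical cross-entropy (the exact loss function used by the preset experiment isn't shown here, so treat this as an illustration of the general idea): the loss is the negative log-probability the model assigns to the correct class.

```python
import math

def cross_entropy(predicted_probs, true_class):
    # Loss is the negative log of the probability given to the true class.
    return -math.log(predicted_probs[true_class])

# A confident, correct prediction gives a low loss...
good = cross_entropy([0.05, 0.90, 0.05], true_class=1)
# ...while a prediction that is far off gives a high one.
bad = cross_entropy([0.80, 0.10, 0.10], true_class=1)

print(round(good, 3), round(bad, 3))  # 0.105 2.303
```

This is why a falling loss curve means the model's predicted probabilities are concentrating on the right answers.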

Confusion matrix

Model evaluation view — Confusion matrix

You can also get information from the confusion matrix. It is used to see how well a system performs classification. The diagonal shows correct predictions; everything outside this diagonal is an error. In a perfect classification, you'll have 100% on the diagonal going from top left to bottom right.
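Under the hood, a confusion matrix is just a table of counts over (true label, predicted label) pairs. A tiny sketch with three made-up classes:

```python
# Rows are true classes, columns are predictions; the diagonal holds the
# correct classifications, everything off-diagonal is a mistake.
num_classes = 3
true_labels = [0, 0, 1, 1, 2, 2]
predictions = [0, 1, 1, 1, 2, 0]

matrix = [[0] * num_classes for _ in range(num_classes)]
for t, p in zip(true_labels, predictions):
    matrix[t][p] += 1

for row in matrix:
    print(row)
# [1, 1, 0]
# [0, 2, 0]
# [1, 0, 1]
```

Here 4 of 6 samples sit on the diagonal; the off-diagonal cells show exactly which classes get mixed up with which, which is more informative than a single accuracy number.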

Results ok – let’s deploy

In later tutorials, we will iterate on the experiment by tweaking the model to improve it. But this is good to go – time to deploy.

Deploy your trained experiment

This experiment is great, but it's of no use as long as it is locked up inside the Peltarion Platform. If you want people to use the trained experiment, you have to get it out in some usable form. This is where the Deployment view comes in. In this tutorial, we will deploy the model as an API.

Create new deployment
  1. In the Deployment view, click New deployment.
  2. Select the experiment and checkpoint of your trained model to test it for predictions or enable it for business product calls. Both the Best epoch and the Last epoch of each trained experiment are available for deployment.
  3. Click the Enable switch to deploy the experiment.
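Once enabled, the deployment can be called over HTTP. The endpoint URL, token, and payload field names below are all placeholders, not the platform's actual API format – check your deployment page and the platform's API documentation for the real values. A sketch of how a client might prepare such a request, using only Python's standard library:

```python
import base64
import json

# Placeholders -- use the URL and token shown on your own deployment page.
DEPLOYMENT_URL = "https://example.com/deployments/<id>/forward"
API_TOKEN = "<your-deployment-token>"

# In practice these would be the raw bytes of your 28x28 RGB PNG,
# e.g. open("five.png", "rb").read(); a stand-in is used here.
image_bytes = b"\x89PNG...image data..."

# Base64-encode the image so it can travel inside a JSON body.
# The "rows"/"Image" field names are illustrative, not the real schema.
payload = json.dumps(
    {"rows": [{"Image": base64.b64encode(image_bytes).decode("ascii")}]}
)
headers = {
    "Authorization": f"Bearer {API_TOKEN}",
    "Content-Type": "application/json",
}

# The actual call would be an HTTP POST of `payload` to DEPLOYMENT_URL
# with `headers`; the response would contain the predicted Number.
```

The key idea is that binary image data must be encoded (here as base64) before it can be embedded in a JSON request body.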

Test the MNIST classifier

Click the Test deployment button. The following page will show up:

Add image to the test classifier

Download this image of a handwritten number five:

Drop the image in the classifier and click the Result icon to get a result.

If an error occurs, make sure that the uploaded image has three channels (RGB) and is 28 x 28 pixels in size.

Result - Success!!

Whazaaaaam!!! You have created an operational AI experiment.

Not working? Remember it's a prediction

If it doesn’t work every time, remember that the loss from the experiment isn’t 0. Hence, the experiment will predict the wrong number in some cases.

Tutorial recap and next steps

In this tutorial you’ve created an AI experiment that you trained, evaluated, and deployed. You have used all the tools you need to go from data to production — easily and quickly.

You've used one labeled dataset and a CNN to predict numbers, but you can improve results by using multiple types of input data, both tabular data and images. Try this in the tutorial Predict California house prices. Predicting a house price from both tabular and image inputs is a distinctive problem that is hard to solve with anything other than deep learning.

Use the web app for more things

The web app can be used for any image classifier, not only for predicting numbers. Save the link and use the web app for later image classification projects.