How to improve a model that uses tabular data

Tips on how to improve your model for tabular data

Once you have a working model, the next important step is making systematic changes to the design to try to improve the performance.

  • Target audience: Intermediate users
  • Estimated time: It depends… on how much you want to improve your model

You will learn to
  • Run multiple experiments.
  • Test new ideas that may improve your model.

Note
This tutorial is a continuation of the Tabular data tutorial. If you do not yet have a working model that runs on tabular data, have a go at the tabular data tutorial and come back to this one once you have completed it.

Run several experiments and test new ideas

The Peltarion Platform allows you to duplicate an experiment with the click of a button. This makes it very easy to iterate and update your model efficiently and systematically.

There are many different possibilities for how to tweak a model you have built. Here is a collection of some of the most common ways in which you can improve the performance of the model:


Increase patience to train for more epochs

Take advantage of how quickly models train on tabular data by increasing the patience of your model. This will allow you to analyze the performance over more training epochs.

An epoch is when the model trains on every example in the training set once. The amount of time each epoch takes depends on the size of the training set and the complexity of the input data. Tabular data typically requires comparatively simple model architectures, so the training time for each epoch will be significantly faster than for models that run on images or text.

The patience of your model is the number of epochs that the model will wait for the performance to improve before early stopping.

The default setting for this is typically 5, which means that if the model trains for 5 consecutive epochs without improving the validation loss (or whichever metric you choose to monitor), the model will stop training.

How to increase patience

Navigate to the Settings section of the modeling view and increase the Patience value under Early stopping.
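The platform exposes patience as a single setting, but if you are curious how the same mechanism looks in code, here is a minimal sketch using Keras (the framework choice and the patience value of 10 are illustrative assumptions, not the platform's internals):

    import tensorflow as tf

    # Early stopping with increased patience: training continues until the
    # validation loss fails to improve for 10 consecutive epochs, instead
    # of stopping after the more common default of 5.
    early_stopping = tf.keras.callbacks.EarlyStopping(
        monitor="val_loss",
        patience=10,                # raised from the typical default of 5
        restore_best_weights=True,  # roll back to the best epoch seen
    )

    # The callback is then passed to training, e.g.:
    # model.fit(X_train, y_train, validation_split=0.2,
    #           epochs=200, callbacks=[early_stopping])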


Change the model architecture

The architecture of your model is the general design, the different blocks you use, the number of layers and how many nodes in each layer, etc. Here are a few examples of how to improve your architecture.

Dense blocks - number of layers and number of nodes in each layer

  • Your first iteration should be relatively simple, i.e., a few layers with a low number of nodes. If your model struggles to make good predictions, try increasing the complexity of your architecture.

  • The more columns your data has and the more complicated the relationships between variables, the more complex the architecture should be. Try increasing the number of nodes per layer first, and then the number of layers in your model.

    • Keep in mind that tabular data is in general not very complicated, so it is unlikely that your model will require an excessively complex architecture.

  • This is often a trial-and-error process, so be systematic in your approach. Run several experiments and do not change too many things between experiments. This will help you identify which changes are beneficial and which are not.
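For intuition, here is a minimal sketch of what such a simple first architecture might look like in Keras outside the platform (the layer sizes and the 10 input columns are assumptions for illustration):

    import tensorflow as tf

    # A deliberately simple first architecture for tabular data:
    # two dense (fully connected) layers with a modest number of nodes.
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(10,)),              # one input per column
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(16, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),  # binary target
    ])

    # If this struggles, widen the layers (e.g. 64/32) or add a layer,
    # changing only one thing per experiment so the effect is attributable.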

Batch normalization blocks

  • Batch normalization blocks learn how to scale and shift mini-batches to have zero mean and unit variance, which makes training more stable.

  • If you notice that your model performs significantly better on the training subset than on the validation subset after a few epochs, it is likely that your model is overfitting.

    • Batch normalization can reduce the risk of overfitting and help the model perform better on new data that it has not yet seen.

  • Insert a batch normalization block after one of your dense blocks and run the experiment again to see if it helps reduce overfitting.
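Continuing the hypothetical Keras sketch above, a batch normalization layer inserted after a dense layer would look like this (layer sizes remain illustrative assumptions):

    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(10,)),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.BatchNormalization(),  # rescales each mini-batch
        tf.keras.layers.Dense(16, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])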

Dropout blocks

  • Dropout blocks randomly disable a fraction of the nodes during each training step. This reduces the risk of the model relying excessively on a few parameters and forces it to learn in a more general way from all of the input data and hidden layers.

  • Dropout blocks are another way to reduce the risk of overfitting, so you can insert them in the same way as the batch normalization block above to stop your model from overfitting.
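As a sketch of the same idea in Keras, a dropout layer placed after a dense layer might look like this (the rate of 0.3 is an illustrative assumption; common values range from about 0.1 to 0.5):

    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(10,)),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dropout(0.3),  # zeroes 30% of the layer's outputs per
                                       # training step; disabled at inference
        tf.keras.layers.Dense(16, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])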


Change the learning rate

The learning rate is a parameter that determines how large a step the optimizer takes each time it updates the model's weights.

  • As mentioned previously, tabulated data is typically much less complex than other data types. This means that a lower learning rate will often perform better.

  • Try a few different orders of magnitude away from the default setting and look at the metric curves to see how it impacts performance.

    • In a nutshell, if your learning rate is too high, your model will struggle to settle on an optimum and your validation loss will increase again. You will see that your metric curves continue to vary a lot even after a large number of epochs.

    • If your learning rate is too low, your validation loss will decrease smoothly but slowly and you might need an excessive number of epochs to reach the same performance. You also run the risk of your model settling and not improving despite it not yet having reached its optimal performance.

How to change learning rate

Navigate to the Settings section in the Modeling view and change the Learning rate value.
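To make the "orders of magnitude" advice concrete, here is a hypothetical sketch of how you might sweep learning rates in Keras (the optimizer choice and the three values are assumptions):

    import tensorflow as tf

    # Compare learning rates a few orders of magnitude apart, training a
    # fresh model with each one and comparing the validation loss curves.
    for lr in (1e-2, 1e-3, 1e-4):
        optimizer = tf.keras.optimizers.Adam(learning_rate=lr)
        # model.compile(optimizer=optimizer, loss="binary_crossentropy")
        # model.fit(...)  # one experiment per learning rate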


Set a learning rate schedule

Another setting that is worth experimenting with is adding a learning rate schedule. This allows you to change the learning rate between epochs.

If you use exponential decay or linear decay, your model will take larger steps at the beginning of training to get close to an optimum, and then decrease the learning rate to fine-tune.

  • A decreasing learning rate schedule is analogous to a golfer using a driver to get the ball closer to the hole and then changing to a putter for smaller, more precise strokes.

You can also use learning rate schedules such as triangular, which has a warm-up period where the learning rate first increases and then decreases, or reduce-on-plateau, which lowers the learning rate whenever the monitored metric stops improving. A warm-up helps reduce the risk of your model skewing badly towards a few clustered examples in the early stages of training, which is particularly useful for highly differentiated data.

There are a few different learning rate schedules to choose from and this again is a trial and error process. Have a look at the learning rate schedule article to find out more about the details of the different options.

How to set learning rate schedule

Navigate to the Settings section in the Modeling view and change the Learning rate schedule.
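As an illustration of what a decaying schedule does under the hood, here is a hypothetical exponential-decay schedule in Keras (all three numbers are assumptions chosen for the example):

    import tensorflow as tf

    # The learning rate starts at 1e-3 and is multiplied by 0.9 every
    # 1000 optimizer steps: large steps early, fine-tuning later.
    schedule = tf.keras.optimizers.schedules.ExponentialDecay(
        initial_learning_rate=1e-3,
        decay_steps=1000,
        decay_rate=0.9,
    )
    optimizer = tf.keras.optimizers.Adam(learning_rate=schedule)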


Increase batch size

The batch size is the number of examples that are used in each training iteration. Increasing your batch size lets the model learn from more examples in each update, which can speed up training and affect performance.

With a larger batch size, your model may require more epochs to reach the same results as with a smaller batch size. As discussed earlier, this is not much of a problem for tabular data because each epoch trains quickly, so you can increase the number of epochs at little cost.

How to change batch size

Navigate to the Settings section in the Modeling view and change the Batch size.
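For intuition, this is roughly how batch size enters the picture in a standalone Keras script (the data shapes, model, and batch size of 64 are all illustrative assumptions):

    import numpy as np
    import tensorflow as tf

    # Placeholder tabular data: 1000 rows, 10 columns, binary target.
    X = np.random.rand(1000, 10).astype("float32")
    y = np.random.randint(0, 2, size=(1000, 1))

    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(10,)),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy")

    # Doubling the batch size from a typical 32 to 64; with larger batches
    # you may need more epochs, which is cheap for tabular data.
    model.fit(X, y, batch_size=64, epochs=20, validation_split=0.2)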
