Improve your model

Change run settings

For any kind of problem, you can change any of these settings in the Run settings section of the Modeling view.

The combination of the number of epochs, batch size, and learning rate affects training results.
A larger batch size allows a higher learning rate and can shorten training time. However, you might need more epochs to reach the same results as with a smaller batch size. Conversely, consider reducing the learning rate when you lower the batch size.
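
A common rule of thumb for this trade-off is the linear scaling heuristic: scale the learning rate in proportion to the batch size. This heuristic is an addition here, not a platform setting, and the baseline values below are hypothetical. A minimal sketch in Python:

```python
# Linear scaling heuristic: adjust the learning rate in proportion
# to the batch size. Baseline values are hypothetical examples.
base_batch_size = 32
base_learning_rate = 1e-3

def scaled_learning_rate(batch_size):
    """Scale the learning rate linearly with the batch size."""
    return base_learning_rate * batch_size / base_batch_size

print(scaled_learning_rate(64))  # larger batch -> higher rate (2e-3)
print(scaled_learning_rate(8))   # smaller batch -> lower rate (2.5e-4)
```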

  • Increase Patience in Early Stopping
    How? Increase Patience (the default is 5 epochs), or skip early stopping completely and let the training run for the full number of epochs.
    Early stopping is a feature that automatically stops training when a chosen metric has stopped improving.
    Why? A larger Patience means that an experiment waits longer for the metric to improve before stopping, so you can achieve better results. The code sketch after this list shows Patience together with the other run settings.

  • Adjust the batch size
    How? Try to run your experiment with a larger or smaller batch size. Commonly used sizes on our platform are 4, 8, 32, 64, 256, and 512.
    The batch size is the number of examples processed in each training iteration, after which your model parameters are updated. The batch size influences the ability of the model to learn.
    Why? Changing the batch size can help the optimization process.

  • Change the learning rate
    How? Reasonable values for the learning rate range from 0.1 to 10^-5. The learning rate can be lower, but it should never be higher than 1.
    The learning rate controls the size of the update step along the gradient. With a small learning rate you can expect consistent but very small progress.
    Why? A learning rate that is too low means you might get stuck in a local minimum and never reach the global minimum, that is, the best result. A learning rate that is too high means you might step over the lowest value. Read more on learning rate and optimizers here.

  • Increase the number of epochs
    How? Increase Epochs (the default is 100) and deselect Early Stopping.
    An epoch is one pass of all the training data through the model.
    Why? You might get a better result if you let the experiment train for longer.
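
On the platform these settings are applied through the Run settings UI. For readers who want a code analogy, here is a minimal Keras sketch that wires together all four settings above: epochs, batch size, learning rate, and early-stopping Patience. The model, data, and specific values are placeholders, not platform defaults beyond those named in the list.

```python
import numpy as np
import tensorflow as tf

# Placeholder data; in practice this comes from your dataset.
x = np.random.rand(1000, 20).astype("float32")
y = np.random.randint(0, 2, size=(1000, 1))

# Placeholder model; on the platform this is the blocks you
# assembled in the Modeling view.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(20,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Learning rate: never higher than 1; 1e-3 is a common starting point.
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
    loss="binary_crossentropy",
    metrics=["accuracy"],
)

# Early stopping: stop when the monitored metric has not improved
# for `patience` epochs (the default named above is 5).
early_stopping = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=5, restore_best_weights=True,
)

model.fit(
    x, y,
    epochs=100,            # the default number of epochs named above
    batch_size=32,         # one of the commonly used sizes
    validation_split=0.2,
    callbacks=[early_stopping],  # remove to run the full 100 epochs
)
```

Removing the callback corresponds to deselecting Early Stopping, so training runs for the full number of epochs.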

Change block parameters

You can also change the parameters in specific blocks to experiment with the model for better performance, for example, the Activation or the number of Nodes in a Dense block. The sketch below shows what such a change looks like in code.
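
Here is a minimal sketch assuming the Dense block corresponds to a Keras Dense layer, with Nodes mapping to units and Activation to activation; the specific values are hypothetical:

```python
import tensorflow as tf

# Original block: 64 nodes with ReLU activation.
dense = tf.keras.layers.Dense(units=64, activation="relu")

# Experiment variant: more nodes and a different activation function.
dense_variant = tf.keras.layers.Dense(units=128, activation="tanh")
```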
