Evaluation view

In the Evaluation view you can see in real time how the model is performing as it learns from the data.
Not learning as well as expected? Pause the experiment, go back to the Datasets view or Modeling view, make some tweaks, and get running again, quick-iteration style.

Evaluation view elements

The Evaluation view is organized into three areas:

  1. The left side of the page shows all the experiments in your current project.
    Select an experiment from this list to focus the rest of the view on it. Experiments that use a different loss function, and so might not be directly comparable, are marked with a lighter color.

  2. The Loss and metrics area shows information about the general performance of the model.
    The evolution of various metrics is plotted over time, giving you feedback on the training progress. Similar curves for other experiments that share the same loss function are also plotted, so that you can compare how well and how fast each version is training (see the sketch after this list).

  3. The bottom area of the page lets you inspect the predictions of the model in more detail.
    You can interactively filter and look at the model output for individual examples, which gives you more insight into the model's performance.
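The Platform draws the comparison curves in the Loss and metrics area for you. Purely as a conceptual sketch of what that comparison shows, the standalone Python/matplotlib snippet below plots per-epoch validation loss for two experiments that share a loss function; the experiment names and loss values are made-up assumptions, not Platform output.

```python
# Conceptual sketch only: the Platform renders these curves in the Loss and
# metrics area. Experiment names and loss values below are invented examples.
import matplotlib.pyplot as plt

# Hypothetical per-epoch validation loss for two experiments
# that were trained with the same loss function.
experiments = {
    "Experiment 1": [0.92, 0.61, 0.45, 0.38, 0.34, 0.33],
    "Experiment 2 (lower learning rate)": [0.95, 0.78, 0.63, 0.52, 0.44, 0.39],
}

for name, losses in experiments.items():
    epochs = range(1, len(losses) + 1)
    plt.plot(epochs, losses, marker="o", label=name)

plt.xlabel("Epoch")
plt.ylabel("Validation loss")
plt.title("Comparing experiments that share a loss function")
plt.legend()
plt.show()
```

Curves from experiments trained with a different loss function live on a different scale, which is why such experiments are shown in a lighter color rather than compared directly.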

Figure 1. Evaluation view