Support for advanced optimizer parameters
The experiment settings panel now supports advanced optimizer parameters, giving you finer control over how the optimizer function behaves. In the Compiler options of the model builder, under the Optimizer function, you can now select Advanced options, which exposes a range of new optimizer parameters, from Decay to Apply Amsgrad. Previously, the only parameter available here was Learning rate, with a fixed default of 0.001. But different optimizer function methods work best with different learning rates, so applying the same value across all of them makes little sense; a distinction must be made. You can now set a separate learning rate, along with the other advanced parameters, for each optimizer function method.
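To make the idea concrete, here is a minimal sketch of what these settings correspond to in code. The Peltarion Platform does not document its internal mapping, so this is an assumption expressed in tf.keras terms: Adam's learning rate, the Apply Amsgrad toggle, and the point that a second optimizer such as SGD typically wants a different learning rate than Adam's 0.001 default.

```python
import tensorflow as tf

# Tiny stand-in model so the compile step below actually runs.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation="softmax", input_shape=(784,))
])

# Adam: the platform's old fixed default of 0.001 matches Keras's default;
# the new Advanced options let you adjust parameters like these per optimizer.
adam = tf.keras.optimizers.Adam(
    learning_rate=0.001,  # now tunable instead of fixed
    amsgrad=True,         # roughly what the "Apply Amsgrad" option toggles
)

# SGD generally works best with a different learning rate than Adam,
# which is why a single shared default across methods is a poor fit.
sgd = tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.9)

model.compile(optimizer=adam, loss="categorical_crossentropy")
```

Swapping `optimizer=adam` for `optimizer=sgd` changes the effective learning rate along with the method, which is exactly the per-optimizer control the Advanced options now expose.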
These were the major updates and improvements added across recent sprints. We’ll continue to share updates on the progress of the Peltarion Platform every few weeks – so stay tuned!