Efficient C++ library for mathematical optimization
ensmallen provides a simple set of abstractions for writing an objective
function to optimize. It also provides a large set of standard and
cutting-edge optimizers that can be used for virtually any mathematical
optimization task. These include full-batch gradient descent techniques,
small-batch techniques, gradient-free optimizers,
and constrained optimization techniques.
- Developed at devel:libraries:c_c++
- Sources inherited from project openSUSE:Factory
- 2 derived packages
- Download package
- Checkout package:
  osc -A https://api.opensuse.org checkout openSUSE:Factory:zSystems/ensmallen && cd $_
Source Files
Filename | Size
---|---
ensmallen-2.19.0.tar.gz | 1.27 MB
ensmallen.changes | 2.85 KB
ensmallen.spec | 2.34 KB
Revision 4 (latest revision is 7)
Dominique Leuenberger (dimstar_suse) accepted request 1040059 from Dirk Mueller (dirkmueller) (revision 4)
- update to 2.19.0:
  * Added DemonSGD and DemonAdam optimizers
  * Fix bug with Adam-like optimizers not resetting when `resetPolicy` is `true`
  * Add Yogi optimizer
  * Add AdaBelief optimizer
  * Add AdaSqrt optimizer
  * Bump check for minimum supported version of Armadillo
  * Update Catch2 to 2.13.8
  * Fix epoch timing output
  * Accelerate SGD test time
  * Fix potential infinite loop in CMAES
  * Fix SCD partial gradient test
  * Add gradient value clipping and gradient norm scaling callback
  * Remove superfluous CMake option to build the tests
  * Bump minimum Armadillo version to 9.800
  * Update Catch2 to 2.13.7
  * Remove redundant template argument for C++20 compatibility
  * Fix MOEAD test stability