Python toolbox for performing gradient-free optimization
Nevergrad is a gradient-free optimization platform. The goals of this package are to provide:
- gradient/derivative-free optimization algorithms, including algorithms able to handle noise;
- tools to instrument any code, making it painless to optimize your parameters/hyperparameters, whether they are continuous, discrete, or a mixture of continuous and discrete variables;
- functions on which to test the optimization algorithms;
- benchmark routines in order to compare algorithms easily.
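The core idea behind such a toolbox, derivative-free search, can be sketched without the library itself. The snippet below is a minimal random-search sketch in plain Python; the function names, hyperparameters, and toy objective are illustrative assumptions, not nevergrad's own API:

```python
import random

def random_search(objective, dim, budget=500, scale=2.0, seed=0):
    """Minimal derivative-free optimizer: sample candidate points
    uniformly and keep the best one seen, using only objective
    evaluations (no gradients are ever computed)."""
    rng = random.Random(seed)
    best_x, best_val = None, float("inf")
    for _ in range(budget):
        x = [rng.uniform(-scale, scale) for _ in range(dim)]
        val = objective(x)
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

# Toy objective with its minimum at (1, -1); illustrative only.
def sphere(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 1.0) ** 2

best_x, best_val = random_search(sphere, dim=2)
```

Real optimizers in the package improve on this brute-force loop (e.g. by adapting where new candidates are drawn), but the contract is the same: the algorithm only ever queries objective values.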
$ pkg install py311-nevergrad

Origin: math/py-nevergrad
Size: 5.86 MiB
License: MIT
Maintainer: sunpoet@FreeBSD.org
Dependencies: 6 packages
Required by: 0 packages