GSL provides good subroutines for multivariable nonlinear optimization, but its only derivative-free optimization (DFO) method is the Nelder-Mead simplex method, which Numerical Recipes claims is “almost surely” slower than Powell’s direction set method “in all likely applications”. You can find a list of DFO solvers here, but few of them are implemented in C/C++.
I have reimplemented a Hooke-Jeeves solver, adapted Powell’s direction set method from Numerical Recipes in C, and ported the NEWUOA solver from Fortran to C++ with f2c. Of the three, the Hooke-Jeeves method is the simplest: it probes the region around the current point and reduces the search radius in each iteration. It is purely heuristic and hardly qualifies as an “algorithm”. Powell’s direction set method minimizes the objective function by repeatedly performing line minimization along a set of directions using Brent’s method; it is believed to outperform the Nelder-Mead method. NEWUOA is a much more advanced method; frankly, I do not know how it works. A benchmark shows that NEWUOA outperforms both NMSMAX, an implementation of the Nelder-Mead method, and APPSPACK. All three solvers work well on the problem I want to solve.
The three solvers are implemented as C++ template headers, one header file per solver. The Hooke-Jeeves solver is here and NEWUOA is here. I cannot release the source code for Powell’s direction set method, as the Numerical Recipes license disallows redistribution.