Package schrodinger :: Package application :: Package matsci :: Module mecp_mod :: Class BFGS

Class BFGS

object --+
         |
        BFGS

Manage a BFGS optimization.
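A minimal usage sketch, assuming the class is imported from schrodinger.application.matsci.mecp_mod as named in the page header; the quadratic objective, its gradient, and the starting point below are illustrative placeholders, not part of the module.

    import numpy
    from schrodinger.application.matsci.mecp_mod import BFGS

    # Illustrative objective and gradient (not from the module).
    def fun(x):
        return float(numpy.dot(x, x))

    def jac(x):
        return 2.0 * x

    optimizer = BFGS(max_iterations=50, verbose=False)
    result = optimizer.minimize(fun, numpy.array([1.0, -2.0, 0.5]), jac=jac)
    print(result.x, result.fun)  # standard scipy OptimizeResult attributes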

Instance Methods
 
__init__(self, c1=0.0001, c2=0.9, amax=50.0, amin=1e-08, xtol=1e-14, max_force=0.0005, max_iterations=50, eps=0.0001, init_hess='identity', verbose=False, logger=None)
    Create an instance.

line_search_wolfe12(self, f, fprime, xk, pk, gfk=None, old_fval=None, old_old_fval=None) -> tuple
    Same as line_search_wolfe1, but fall back to line_search_wolfe2 if a suitable step length is not found; raise an exception if neither search finds one.

resetFiniteDiffCall(self)
    Reset the finite difference call.

getInitialInvHessian(self, fun, jac, fun_0, jac_0, x_0) -> numpy.array
    Return an initial guess for the inverse Hessian.

minimize(self, fun, x_0, jac=None, **kwargs) -> scipy.optimize.optimize.OptimizeResult
    Minimization of a function using the BFGS algorithm.

Inherited from object: __delattr__, __format__, __getattribute__, __hash__, __new__, __reduce__, __reduce_ex__, __repr__, __setattr__, __sizeof__, __str__, __subclasshook__

Class Variables
  LARGE_RHO_K = 1000.0
  IMAG_TOL = 1e-06
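For context, rho_k = 1 / (y_k . s_k) appears in the standard BFGS inverse-Hessian update sketched below; treating LARGE_RHO_K as a cap for a near-singular y_k . s_k is an assumption about this class, not documented behavior.

    import numpy

    def bfgs_inverse_hessian_update(H, s, y, large_rho=1000.0):
        # Standard BFGS update of the inverse Hessian H for step s = x_{k+1} - x_k
        # and gradient change y = g_{k+1} - g_k.  Capping rho at large_rho when
        # y . s is tiny mirrors what LARGE_RHO_K presumably guards (assumption).
        ys = numpy.dot(y, s)
        rho = large_rho if abs(ys) < 1.0 / large_rho else 1.0 / ys
        identity = numpy.eye(len(s))
        V = identity - rho * numpy.outer(s, y)
        return V.dot(H).dot(V.T) + rho * numpy.outer(s, s)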
Properties

Inherited from object: __class__

Method Details

__init__(self, c1=0.0001, c2=0.9, amax=50.0, amin=1e-08, xtol=1e-14, max_force=0.0005, max_iterations=50, eps=0.0001, init_hess='identity', verbose=False, logger=None) (Constructor)

Create an instance.

Parameters:
  • c1 (float) - parameter for Armijo condition rule
  • c2 (float) - parameter for curvature condition rule
  • amax (float) - maximum allowable step size
  • amin (float) - minimum allowable step size
  • xtol (float) - nonnegative relative tolerance for an acceptable step; the line search exits with a warning if the relative difference between sty and stx, the endpoints of the bracketing interval, is less than xtol
  • max_force (float) - maximum allowable force element
  • max_iterations (int) - the maximum number of iterations
  • eps (float) - step size in Angstrom for any finite difference approximations
  • init_hess (str) - the type of initial Hessian to use
  • verbose (bool) - specifies verbose logging
  • logger (logging.Logger or None) - output logger or None if there isn't one
Overrides: object.__init__
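A hedged construction example using only the parameters documented above; the specific values are illustrative, not recommendations from the module.

    import logging

    from schrodinger.application.matsci.mecp_mod import BFGS

    optimizer = BFGS(
        c1=1e-4,               # Armijo condition parameter
        c2=0.9,                # curvature condition parameter
        max_force=3e-4,        # threshold on the largest force element
        max_iterations=100,
        eps=1e-4,              # Angstrom, for finite difference approximations
        init_hess='identity',
        verbose=True,
        logger=logging.getLogger(__name__),
    )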

line_search_wolfe12(self, f, fprime, xk, pk, gfk=None, old_fval=None, old_old_fval=None)

Same as line_search_wolfe1, but fall back to line_search_wolfe2 if a suitable step length is not found; raise an exception if neither search finds one.

Parameters:
  • f (function) - function on which to perform the line search
  • fprime (function) - gradient of the function on which to perform the line search
  • xk (numpy.array) - point at which to start the search
  • pk (numpy.array) - search direction
  • gfk (numpy.array or None) - gradient of the function evaluated at xk; computed if None
  • old_fval (float or None) - function value at xk; computed if None
  • old_old_fval (float or None) - function value at the point preceding xk, or None if there isn't one
Returns: tuple
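For reference, the Armijo (c1) and strong curvature (c2) conditions that Wolfe line searches enforce can be checked directly; this standalone helper is a generic sketch, not code from the class.

    import numpy

    def satisfies_strong_wolfe(f, fprime, xk, pk, alpha, c1=1e-4, c2=0.9):
        # Armijo (sufficient decrease) and strong curvature conditions for a
        # candidate step length alpha along search direction pk from xk.
        phi0 = f(xk)
        dphi0 = numpy.dot(fprime(xk), pk)
        armijo = f(xk + alpha * pk) <= phi0 + c1 * alpha * dphi0
        curvature = abs(numpy.dot(fprime(xk + alpha * pk), pk)) <= c2 * abs(dphi0)
        return armijo and curvature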

getInitialInvHessian(self, fun, jac, fun_0, jac_0, x_0)

Return an initial guess for the inverse Hessian.

Parameters:
  • fun (function) - function to minimize
  • jac (function) - the Jacobian of the function being minimized
  • fun_0 (float) - function value at the initial solution
  • jac_0 (numpy.array) - the Jacobian evaluated at the initial solution
  • x_0 (numpy.array) - initial solution
Returns: numpy.array
the initial guess for the inverse Hessian (N/3 x 3)
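One common way to seed an inverse Hessian is a scaled identity in the spirit of Nocedal and Wright; the sketch below is a generic illustration and not necessarily how getInitialInvHessian constructs its guess.

    import numpy

    def scaled_identity_inverse_hessian(jac, jac_0, x_0, eps=1e-4):
        # Scale the identity by (y . s) / (y . y), where s is a small step of
        # size eps along the downhill gradient direction and y is the gradient
        # change over that step.
        g0 = numpy.asarray(jac_0, dtype=float)
        s = -eps * g0 / max(numpy.linalg.norm(g0), 1e-12)
        y = numpy.asarray(jac(x_0 + s), dtype=float) - g0
        ys, yy = numpy.dot(y, s), numpy.dot(y, y)
        gamma = ys / yy if ys > 0.0 and yy > 0.0 else 1.0
        return gamma * numpy.eye(len(x_0))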

minimize(self, fun, x_0, jac=None, **kwargs)

Minimization of a function using the BFGS algorithm.

Parameters:
  • fun (function) - function to minimize
  • x_0 (numpy.array) - initial solution
  • jac (function or None) - the Jacobian of the function being minimized, if available
Returns: scipy.optimize.optimize.OptimizeResult
the optimization result
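A usage sketch for minimize with a hypothetical analytic objective (a three-variable Rosenbrock-style function); the attributes read from the result are standard scipy.optimize.OptimizeResult fields.

    import numpy
    from schrodinger.application.matsci.mecp_mod import BFGS

    def fun(x):
        # Illustrative objective, not from the module.
        return float((1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2 + x[2] ** 2)

    def jac(x):
        # Analytic gradient of the objective above.
        return numpy.array([
            -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0] ** 2),
            200.0 * (x[1] - x[0] ** 2),
            2.0 * x[2],
        ])

    optimizer = BFGS(max_iterations=200)
    result = optimizer.minimize(fun, numpy.array([-1.0, 1.5, 0.3]), jac=jac)
    print(result.success, result.nit, result.x)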