Package schrodinger :: Package application :: Package matsci :: Module mecp_mod :: Class BFGS

Class BFGS

object --+
         |
        BFGS

Manage a BFGS optimization.

Instance Methods
 
__init__(self, c1=0.0001, c2=0.9, amax=50.0, amin=1e-08, xtol=1e-14, max_force=0.0001, max_iterations=50, eps=0.0001, init_hess='identity', verbose=False, logger=None)
Create an instance.
 
line_search_wolfe1(self, f, fprime, xk, pk, gfk=None, old_fval=None, old_old_fval=None, args=(), c1=0.0001, c2=0.9, amax=50, amin=1e-08, xtol=1e-14)
As `scalar_search_wolfe1` but do a line search in direction `pk`
 
scalar_search_wolfe1(self, phi, derphi, phi0=None, old_phi0=None, derphi0=None, c1=0.0001, c2=0.9, amax=50, amin=1e-08, xtol=1e-14)
Scalar function search for alpha that satisfies strong Wolfe conditions
 
resetFiniteDiffCall(self)
Reset the finite difference call.
 
getInitialInvHessian(self, fun, jac, fun_0, jac_0, x_0)
Return an initial guess for the inverse Hessian.
Returns: numpy.array
 
minimize(self, fun, x_0, jac=None, **kwargs)
Minimization of a function using the BFGS algorithm.
Returns: scipy.optimize.optimize.OptimizeResult

Inherited from object: __delattr__, __format__, __getattribute__, __hash__, __new__, __reduce__, __reduce_ex__, __repr__, __setattr__, __sizeof__, __str__, __subclasshook__

Class Variables
  LARGE_RHO_K = 1000.0
  IMAG_TOL = 1e-06
  ANGLE_TOL = 0.01
  NEG_HESS_TOL = -0.01
Properties

Inherited from object: __class__

Method Details

__init__(self, c1=0.0001, c2=0.9, amax=50.0, amin=1e-08, xtol=1e-14, max_force=0.0001, max_iterations=50, eps=0.0001, init_hess='identity', verbose=False, logger=None)
(Constructor)

 

Create an instance.

Parameters:
  • c1 (float) - parameter for Armijo condition rule
  • c2 (float) - parameter for curvature condition rule
  • amax (float) - maximum allowable step size
  • amin (float) - minimum allowable step size
  • xtol (float) - nonnegative relative tolerance for an acceptable step; the search exits with a warning if the relative difference between sty and stx, the endpoints of the bracketing interval, is less than xtol
  • max_force (float) - maximum allowable force element
  • max_iterations (int) - the maximum number of iterations
  • eps (float) - step size in Angstrom for any finite difference approximations
  • init_hess (str) - the type of initial Hessian to use
  • verbose (bool) - specifies verbose logging
  • logger (logging.Logger or None) - output logger or None if there isn't one
Overrides: object.__init__

line_search_wolfe1(self, f, fprime, xk, pk, gfk=None, old_fval=None, old_old_fval=None, args=(), c1=0.0001, c2=0.9, amax=50, amin=1e-08, xtol=1e-14)

 

As `scalar_search_wolfe1` but do a line search in direction `pk`

Parameters
----------
f : callable
    Function `f(x)`
fprime : callable
    Gradient of `f`
xk : array_like
    Current point
pk : array_like
    Search direction

gfk : array_like, optional
    Gradient of `f` at point `xk`
old_fval : float, optional
    Value of `f` at point `xk`
old_old_fval : float, optional
    Value of `f` at point preceding `xk`

The rest of the parameters are the same as for `scalar_search_wolfe1`.

Returns
-------
stp, f_count, g_count, fval, old_fval
    Step size, the function and gradient evaluation counts, and the
    function values at the new point and at `xk`
gval : array
    Gradient of `f` at the final point
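
The documented search wraps MINPACK's DCSRCH routine; for intuition only, here is a simplified sketch of what a Wolfe line search in direction `pk` does. The helper name and the step-halving strategy are illustrative assumptions, not this module's implementation:

```python
import numpy as np

def wolfe_line_search(f, fprime, xk, pk, c1=1e-4, c2=0.9, amax=50.0, max_iter=100):
    """Illustrative Wolfe line search along pk (step-halving, not DCSRCH)."""
    phi0 = f(xk)
    derphi0 = np.dot(fprime(xk), pk)            # directional derivative at alpha = 0
    alpha = min(1.0, amax)
    for _ in range(max_iter):
        x_new = xk + alpha * pk
        phi = f(x_new)
        derphi = np.dot(fprime(x_new), pk)
        armijo = phi <= phi0 + c1 * alpha * derphi0    # sufficient decrease (c1)
        curvature = abs(derphi) <= c2 * abs(derphi0)   # strong curvature (c2)
        if armijo and curvature:
            return alpha, phi
        alpha *= 0.5                                   # shrink the trial step
    return None, phi0                                  # no suitable step found
```

On a quadratic `f(x) = x.x` starting from `xk = [2, 0]` with `pk = -fprime(xk)`, the search halves the unit step once and returns `alpha = 0.5` at the minimum.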

scalar_search_wolfe1(self, phi, derphi, phi0=None, old_phi0=None, derphi0=None, c1=0.0001, c2=0.9, amax=50, amin=1e-08, xtol=1e-14)

 

Scalar function search for alpha that satisfies strong Wolfe conditions

The search direction is assumed to be a descent direction (`derphi0 < 0`), so only `alpha > 0` is considered.

Parameters
----------
phi : callable phi(alpha)
    Function at point `alpha`
derphi : callable dphi(alpha)
    Derivative `d phi(alpha)/d alpha`. Returns a scalar.

phi0 : float, optional
    Value of `phi` at 0
old_phi0 : float, optional
    Value of `phi` at the previous point
derphi0 : float, optional
    Value of `derphi` at 0
amax : float, optional
    Maximum step size
c1, c2 : float, optional
    Wolfe parameters

Returns
-------
alpha : float
    Step size, or None if no suitable step was found
phi : float
    Value of `phi` at the new point `alpha`
phi0 : float
    Value of `phi` at `alpha=0`

Notes
-----
Uses routine DCSRCH from MINPACK.
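
The two strong Wolfe conditions the scalar search enforces (sufficient decrease controlled by `c1`, curvature controlled by `c2`) can be written out as a small checker. This helper is illustrative only and is not part of the class:

```python
def satisfies_strong_wolfe(phi, derphi, alpha, c1=1e-4, c2=0.9):
    """Check the strong Wolfe conditions for a trial step alpha."""
    phi0, derphi0 = phi(0.0), derphi(0.0)
    # Armijo / sufficient-decrease condition
    sufficient_decrease = phi(alpha) <= phi0 + c1 * alpha * derphi0
    # Strong curvature condition on the directional derivative
    curvature = abs(derphi(alpha)) <= c2 * abs(derphi0)
    return sufficient_decrease and curvature
```

For `phi(a) = (a - 1)^2`, the exact minimizer `alpha = 1` satisfies both conditions, while a tiny step `alpha = 0.01` fails the curvature test.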

getInitialInvHessian(self, fun, jac, fun_0, jac_0, x_0)

 

Return an initial guess for the inverse Hessian.

Parameters:
  • fun (function) - function to minimize
  • jac (function) - the Jacobian of the function being minimized
  • fun_0 (float) - the function value at the initial solution
  • jac_0 (numpy.array) - the Jacobian value at the initial solution
  • x_0 (numpy.array) - the initial solution
Returns: numpy.array
the initial guess for the inverse Hessian (N/3 X 3)
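
The constructor's `init_hess='identity'` default suggests the simplest choice of initial inverse Hessian. The sketch below shows it next to the common scaled-identity variant built from a finite-difference gradient probe; the `'scaled'` branch and the helper name are assumptions for illustration, not necessarily what this module implements:

```python
import numpy as np

def initial_inv_hessian(jac, x_0, jac_0, kind="identity", eps=1e-4):
    """Two common initial inverse-Hessian guesses (illustrative)."""
    n = x_0.size
    if kind == "identity":
        return np.eye(n)                       # documented default choice
    # Probe the gradient change along a small step down the gradient
    s = -eps * jac_0 / np.linalg.norm(jac_0)
    y = jac(x_0 + s) - jac_0
    gamma = np.dot(s, y) / np.dot(y, y)        # standard (s.y)/(y.y) scaling
    return gamma * np.eye(n)
```

For a quadratic with Hessian `2*I`, the scaled variant recovers the exact inverse Hessian `0.5*I`.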

minimize(self, fun, x_0, jac=None, **kwargs)

 

Minimization of a function using the BFGS algorithm.

Parameters:
  • fun (function) - function to minimize
  • x_0 (numpy.array) - initial solution
  • jac (function) - the Jacobian of the function being minimized
Returns: scipy.optimize.optimize.OptimizeResult
the optimization result
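
For intuition about the algorithm behind `minimize`, here is a minimal standalone BFGS loop over the same `(fun, x_0, jac)` interface, using the documented `max_force` and `max_iterations` convergence controls. It takes unit steps instead of running the Wolfe line search and is a sketch, not the class's implementation:

```python
import numpy as np

def bfgs_minimize(fun, x_0, jac, max_force=1e-4, max_iterations=50):
    """Minimal BFGS iteration (unit steps, identity initial inverse Hessian)."""
    x = np.asarray(x_0, dtype=float)
    n = x.size
    H = np.eye(n)                            # inverse-Hessian estimate
    g = jac(x)
    for _ in range(max_iterations):
        if np.abs(g).max() <= max_force:     # max_force convergence test
            break
        p = -H.dot(g)                        # quasi-Newton search direction
        s = p                                # unit step (no line search here)
        x_new = x + s
        g_new = jac(x_new)
        y = g_new - g
        rho = 1.0 / y.dot(s)                 # assumes y.s > 0 (curvature holds)
        I = np.eye(n)
        # Standard BFGS inverse-Hessian update
        H = (I - rho * np.outer(s, y)).dot(H).dot(I - rho * np.outer(y, s)) \
            + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x, fun(x)
```

On the quadratic `fun(x) = sum((x - 1)^2)` the update recovers the exact inverse Hessian after one step, and the loop converges to `x = [1, 1]` in two iterations.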