
Gradient Descent


Hello,

I'm looking for gradient descent in PyRosetta, but I can't find it. What exactly is gradient descent called in PyRosetta?

Thu, 2020-10-01 16:02
nasim.soltani58

The minimizer is the Rosetta module that performs gradient-descent minimization.  In PyRosetta, this is most easily accessed using the MinMover (https://graylab.jhu.edu/PyRosetta.documentation/pyrosetta.rosetta.protocols.minimization_packing.html#pyrosetta.rosetta.protocols.minimization_packing.MinMover).
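A minimal usage sketch of the MinMover (assuming a working, licensed PyRosetta installation; the sequence and settings below are placeholders, not a recommended protocol):

```python
from pyrosetta import init, pose_from_sequence, get_fa_scorefxn
from pyrosetta.rosetta.core.kinematics import MoveMap
from pyrosetta.rosetta.protocols.minimization_packing import MinMover

init()

pose = pose_from_sequence("TESTPEPTIDE")  # placeholder pose
scorefxn = get_fa_scorefxn()

# A MoveMap controls which degrees of freedom the minimizer may change;
# here both backbone and side-chain torsions are allowed to move.
movemap = MoveMap()
movemap.set_bb(True)
movemap.set_chi(True)

min_mover = MinMover()
min_mover.movemap(movemap)
min_mover.score_function(scorefxn)
min_mover.min_type("lbfgs_armijo_nonmonotone")  # the default minimization flavour
min_mover.tolerance(0.001)

min_mover.apply(pose)  # performs gradient-based minimization in place
```

Calling apply() modifies the pose in place; scoring it before and after with scorefxn(pose) will show the energy decrease from minimization.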

Thu, 2020-10-01 16:13
vmulligan

Thanks for your quick reply. Can I change this optimization method to something else like stochastic gradient descent?

Thu, 2020-10-01 16:18
nasim.soltani58

Rosetta currently doesn't have an option for stochastic gradient descent; since it's very rapid to compute the whole gradient vector, there was never any reason to do partial gradients.  There are, however, several flavours of minimization, most of which use different approximations of the inverse of the second-derivative Hessian matrix.  True gradient descent using only gradients is implemented as the "linmin_iterated" minimization type, but this converges slowly and is recommended only for debugging.  The default type is "lbfgs_armijo_nonmonotone", a quasi-Newtonian gradient descent method that uses the low-memory Broyden–Fletcher–Goldfarb–Shanno (L-BFGS) algorithm to approximate the inverse of the Hessian matrix; this converges much more quickly.
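The difference between pure gradient descent and curvature-aware (Newton-type) steps can be sketched with a toy quadratic in plain Python (this is illustrative only, not Rosetta code; the function and step size are made up):

```python
# Toy objective: f(x, y) = x^2 + 10*y^2, minimum at the origin.
# Its gradient is (2x, 20y) and its Hessian is diag(2, 20).

def grad(p):
    x, y = p
    return (2.0 * x, 20.0 * y)

def steepest_descent(p, step=0.04, iters=200):
    # Pure gradient descent: repeatedly step along the negative gradient.
    # Converges, but slowly when the curvature differs between directions.
    for _ in range(iters):
        gx, gy = grad(p)
        p = (p[0] - step * gx, p[1] - step * gy)
    return p

def newton_step(p):
    # Scaling the gradient by the inverse Hessian (here exactly
    # diag(1/2, 1/20)) reaches the minimum of a quadratic in one step.
    # Quasi-Newton methods like L-BFGS approximate this inverse.
    gx, gy = grad(p)
    return (p[0] - gx / 2.0, p[1] - gy / 20.0)

start = (1.0, 1.0)
sd = steepest_descent(start)   # close to (0, 0) after 200 iterations
nt = newton_step(start)        # exactly (0, 0) after one step
```

For a quadratic the Newton step is exact; for Rosetta's non-quadratic energy landscape the L-BFGS approximation must be rebuilt as minimization proceeds, but the same curvature information is what makes it converge faster than the gradient-only "linmin_iterated" type.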

Tue, 2020-10-06 15:13
vmulligan