schrodinger.protein.assignment.ProtAssign.hbond_cluster

Class hbond_cluster

Instance Methods
 
__init__(self)
 
assign_combination(self, ct, icombination, add_labels, label_pkas)
 
create_hybrid(self, local_combinations, interact, random_scaffold=False)
This takes the lowest energy solution, and for each problematic region it searches other solutions (in random order) for any which may have had better luck for just that part of the overall cluster.
 
deconverge(self, ct, interact, comb, problem_cutoff=50.0)
This starts with what is assumed to be a good solution, and then randomizes the states, avoiding any state that would introduce a problem.
 
expand_solutions(self, ct, interact)
This takes an existing set of good solutions and generates more by deconverging them and then iterating them back to convergence.
 
get_atom_name(self, ct, iatom)
 
get_residue_name(self, ct, iatom)
 
initialize_score_storage(self)
 
iterate_to_convergence(self, ct, interact, comb, problem_cutoff=50.0)
This iterates the combination 'comb' to convergence.
 
optimize(self, ct, interact, static_donors, static_acceptors, static_clashers, max_comb, use_propka, propka_pH=7.0, xtal_ct=None)
 
pre_score_pairs(self, ct, interact)
 
pre_score_self(self, ct)
 
recombine_solutions(self, ct, interact)
This is similar to score_sequentially, but begins with some pre-existing good solutions in self.combinations, and then creates hybrids to try to improve on them.
 
score_combination(self, ct, interact, states)
 
score_donor_acceptor(self, ct, donor_heavy, donor_hydrogen, acceptor_heavy, use_xtal=False)
 
score_donor_donor(self, ct, donor1_heavy, donor1_hydrogen, donor2_heavy, donor2_hydrogen, use_xtal=False)
 
score_exhaustively(self, ct, interact, find_all_solutions=True, tolerate_clashes=False)
 
score_pair(self, ct, iacceptors, idonors, iclashers, icharge, jacceptors, jdonors, jclashers, jcharge, use_xtal=False)
 
score_sequentially(self, ct, interact)
This routine uses an algorithm similar to Prime's iteration to convergence.
 
setup_local_static(self, ct, static_acceptors, static_donors, static_clashers)
 
setup_local_static_alt(self, ct, static_acceptors, static_donors, static_clashers)
 
setup_xtal(self, ct, interact, clustering_distance)
 
single_point(self, ct, interact, static_donors, static_acceptors, static_clashers, xtal_ct=None)
 
trim_redundant_combinations(self)
Method Details

create_hybrid(self, local_combinations, interact, random_scaffold=False)

 

This takes the lowest energy solution, and for each problematic region it searches other solutions (in random order) for any which may have had better luck for just that part of the overall cluster. It then splices those solutions into the lowest energy one. If random_scaffold is True, a random solution is selected as the scaffold instead of the lowest energy one.
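
The splicing idea can be pictured roughly as follows. This is only a minimal sketch: splice_hybrid, region_energy, and the (energy, states) representation are hypothetical simplifications for illustration, not the actual hbond_cluster attributes or data structures.

    import random

    def splice_hybrid(solutions, regions, region_energy, problem_cutoff=50.0,
                      random_scaffold=False):
        # `solutions` is a list of (total_energy, states) pairs, `regions` a list
        # of index lists (one per problematic region), and `region_energy` a
        # callable (states, region) -> local energy; all three are hypothetical
        # stand-ins for the real internals.
        if random_scaffold:
            scaffold = list(random.choice(solutions)[1])
        else:
            scaffold = list(min(solutions, key=lambda s: s[0])[1])
        for region in regions:
            if region_energy(scaffold, region) < problem_cutoff:
                continue  # this region is already acceptable in the scaffold
            # Search the solutions in random order for one that did better on
            # just this region, and splice its states into the scaffold.
            for _, states in random.sample(solutions, len(solutions)):
                if region_energy(states, region) < problem_cutoff:
                    for i in region:
                        scaffold[i] = states[i]
                    break
        return scaffold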

expand_solutions(self, ct, interact)

 

This takes an existing set of good solutions and generates more by deconverging them and then iterating them back to convergence. Generates at least 10 new solutions.
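
The deconverge/reconverge loop is sketched below under stated assumptions: expand_solutions_sketch is a hypothetical free function, and deconverge / iterate_to_convergence are passed in as callables standing in for the corresponding methods.

    import random

    def expand_solutions_sketch(good_solutions, deconverge, iterate_to_convergence,
                                minimum_new=10):
        # `good_solutions` is a list of state lists. `deconverge` perturbs a
        # solution without creating new problems; `iterate_to_convergence`
        # returns (converged_states, energy). Both signatures are assumed.
        new_solutions = []
        attempts = 0
        while len(new_solutions) < minimum_new and attempts < 10 * minimum_new:
            attempts += 1
            seed = random.choice(good_solutions)   # start from a known good solution
            perturbed = deconverge(list(seed))     # scramble it, avoiding new problems
            converged, energy = iterate_to_convergence(perturbed)
            if all(converged != s for _, s in new_solutions):
                new_solutions.append((energy, converged))
        return new_solutions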

iterate_to_convergence(self, ct, interact, comb, problem_cutoff=50.0)

 

This iterates the combination 'comb' to convergence. Maximum of 10 cycles.
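
A minimal sketch of that loop, with the 10-cycle cap made explicit; converge_sketch and best_state are hypothetical names, not the real method or helper.

    def converge_sketch(states, best_state, max_cycles=10):
        # `states` is a list of per-changeable state indices; `best_state` is a
        # hypothetical callable (states, i) -> lowest-energy state for changeable
        # i with every other changeable held fixed.
        for _ in range(max_cycles):          # hard cap of 10 cycles
            changed = False
            for i in range(len(states)):
                new = best_state(states, i)  # optimize this changeable alone
                if new != states[i]:
                    states[i] = new
                    changed = True
            if not changed:                  # nothing moved this cycle: converged
                break
        return states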

score_sequentially(self, ct, interact)

 

This routine uses an algorithm similar to Prime's iteration to convergence. Starting from a random configuration, each species is optimized in turn, keeping the others fixed in their current state. This continues until the system reaches convergence (no further changes in the optimal state of any residue).
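
The scheme amounts to iterative conditional minimization, sketched below. score_sequentially_sketch, self_score, and pair_score are hypothetical names playing the role of the pre_score_self / pre_score_pairs machinery; the real method operates on the ct and interact arguments instead.

    import random

    def score_sequentially_sketch(num_states, self_score, pair_score, max_cycles=100):
        # `num_states[i]` is the number of states available to changeable i;
        # `self_score(i, si)` and `pair_score(i, si, j, sj)` are assumed scoring
        # callables returning internal and pairwise energies.
        n = len(num_states)
        states = [random.randrange(k) for k in num_states]  # random starting configuration

        def local_score(i, si):
            # Score changeable i in state si against all others in their current states.
            return self_score(i, si) + sum(
                pair_score(i, si, j, states[j]) for j in range(n) if j != i)

        for _ in range(max_cycles):
            changed = False
            for i in range(n):
                best = min(range(num_states[i]), key=lambda si: local_score(i, si))
                if best != states[i]:
                    states[i] = best
                    changed = True
            if not changed:   # the optimal state of every changeable is stable
                break
        return states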