
Examples not training well #3

Open
pyeguy opened this issue Jun 30, 2018 · 1 comment

Comments

pyeguy commented Jun 30, 2018

I'm running Python 3.6 and getting some strange output when training the examples.
It looks like the library was written for Python 2.x, but after wrapping a few range calls in list, everything appears to execute fine, except that the runs driven by the example input.cfg files stop very early (~12-20 epochs for the Ab-oct example) and produce some pretty horrendous models:
[Screenshot: training results plot, 06-30-2018_01-31 train]
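A minimal, hypothetical sketch of the kind of Python 2 -> 3 change referred to above (not the library's actual call sites):

```python
# In Python 3, range() returns a lazy range object instead of a list,
# so code that mutates or concatenates the result breaks until wrapped.
indices = range(5)        # Python 2: [0, 1, 2, 3, 4]; Python 3: range(0, 5)

indices = list(range(5))  # Python 3-safe equivalent of the old behaviour
indices += [5, 6]         # list operations now behave as they did in Python 2
```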

Any thoughts on where to start troubleshooting?

Also, what versions of RDKit, Theano, and Python was this written with?

connorcoley (Owner) commented

That is a pretty atrocious model. If you're using the input .cfg files as they are, then I'm not sure why you would be seeing that. When I've seen cases like this before, it has been because the learning rate hyperparameter is way too high and the model collapses to a local minimum where it just predicts the mean value.
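As an illustration (a minimal sketch against the Keras 1.x API used in this thread, not this repo's actual training code, and with hypothetical layer sizes), the learning rate is passed to the optimizer at compile time, so lowering it by an order of magnitude or two is a quick first test:

```python
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import Adam

# Hypothetical regression model; the real architecture comes from the input.cfg
model = Sequential()
model.add(Dense(64, input_dim=512, activation='relu'))
model.add(Dense(1, activation='linear'))

# A too-high learning rate (e.g. 0.01) can collapse the network to predicting
# the mean; lr=1e-4 is a more conservative starting point to compare against.
model.compile(optimizer=Adam(lr=1e-4), loss='mse')
```

If the predictions stop collapsing to a flat line with the smaller learning rate, that points at the hyperparameter rather than the version differences.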

These were written for Python 2.7.6. I believe the RDKit version was 2013.09.1, the Theano version was 0.9.0dev2.dev-58e93f9b94113d39fab7da6aefaf354cddead2e1, and the Keras version was 1.1.0.

I don't know how more recent versions would affect the models besides causing syntax errors (which it sounds like you've fixed).

Do you mind trying a couple different hyperparameter settings for the Ab-oct model and seeing if they do any better?
