
docs: add doc strings to the neural networks architectures #630

Merged
gcroci2 merged 2 commits into dev from 608_add_docs_nn on Jul 19, 2024

Conversation

gcroci2 (Collaborator) commented Jul 12, 2024

No description provided.

@gcroci2 gcroci2 changed the base branch from main to dev July 12, 2024 14:19
@gcroci2 gcroci2 linked an issue Jul 12, 2024 that may be closed by this pull request
@gcroci2 gcroci2 requested a review from DaniBodor July 12, 2024 14:19
DaniBodor (Collaborator) left a comment

Really nice that these functions have docstrings now.

I do think that these could be further improved, but I'm not sure we are willing to put the time in right now. As it stands, there is still no basis for a user to decide which network would be the most suitable for them.

Also, ideally, we would start adding the missing type annotations (for which the linting rules are currently suppressed in these modules). When I implemented ruff, adding them turned out to be a lot of hassle, and I realize that is still true. As above, I'm not sure it's worth spending more time figuring out what the types are for each element in each of these functions/classes.

Review threads:
deeprank2/neuralnets/gnn/alignmentnet.py (resolved)
deeprank2/neuralnets/gnn/alignmentnet.py (outdated, resolved)
deeprank2/neuralnets/gnn/foutnet.py (resolved)
deeprank2/neuralnets/gnn/foutnet.py (outdated, resolved)
deeprank2/neuralnets/gnn/ginet.py (outdated, resolved)
deeprank2/neuralnets/gnn/naive_gnn.py (outdated, resolved)
deeprank2/neuralnets/gnn/naive_gnn.py (outdated, resolved)
deeprank2/neuralnets/gnn/naive_gnn.py (outdated, resolved)
deeprank2/neuralnets/gnn/sgat.py (outdated, resolved)
deeprank2/neuralnets/cnn/model3d.py (resolved)
gcroci2 (Collaborator, Author) commented Jul 19, 2024

> I do think that these could be further improved, but I'm not sure we are willing to put the time in right now. As it stands, there is still no basis for a user to decide which network would be the most suitable for them.

I agree, but since we have not yet implemented these networks and many lack detailed references, providing more specific information might be challenging. The key aspect for the user is understanding whether a network is suitable for regression, classification, or both, and that information is there. Given the lack of a one-size-fits-all architecture, experimenting with different networks is often necessary (more detailed descriptions and prior applications can of course be useful, but they don't guarantee optimal performance anyway).
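To illustrate the kind of information these docstrings convey, here is a minimal sketch of a docstring stating task suitability; the class and parameter names below are hypothetical and not the actual deeprank2 API:

```python
# Illustrative sketch only; class and argument names are hypothetical,
# not taken from deeprank2.
import torch
from torch import nn


class ExampleNet(nn.Module):
    """Example network architecture for structure-based predictions.

    Suitable for both regression and classification: the task is selected
    by the size of the output layer (1 output for regression, one output
    per class for classification).

    Args:
        input_shape: Number of input features per node.
        output_shape: Number of outputs per entry. Defaults to 1.
    """

    def __init__(self, input_shape: int, output_shape: int = 1):
        super().__init__()
        self.fc = nn.Linear(input_shape, output_shape)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        """Apply the single linear layer to the input features."""
        return self.fc(x)
```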

> Also, ideally, we would start adding the missing type annotations (for which the linting rules are currently suppressed in these modules). When I implemented ruff, adding them turned out to be a lot of hassle, and I realize that is still true. As above, I'm not sure it's worth spending more time figuring out what the types are for each element in each of these functions/classes.

While this would of course be a nice improvement, I don't think it's worth investing more time in it at this stage.
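For context, filling in the missing annotations would look roughly like the sketch below; the layer and attribute names are illustrative assumptions (it assumes torch_geometric-style graph inputs), not the exact signatures used in these modules:

```python
# Illustrative sketch of adding the currently missing type annotations;
# names and signatures are assumptions, not the actual deeprank2 modules.
import torch
from torch import nn
from torch_geometric.data import Data  # assumes PyTorch Geometric is installed


class ExampleGNNLayer(nn.Module):
    def __init__(self, num_features: int, hidden_size: int) -> None:
        super().__init__()
        self.lin = nn.Linear(num_features, hidden_size)

    def forward(self, data: Data) -> torch.Tensor:
        # Annotating the graph argument and the returned tensor is what the
        # currently suppressed linting rules would require for each method.
        return torch.relu(self.lin(data.x))
```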

@gcroci2 gcroci2 merged commit 8a154d6 into dev Jul 19, 2024
7 checks passed
@gcroci2 gcroci2 deleted the 608_add_docs_nn branch July 19, 2024 10:25
Development

Successfully merging this pull request may close these issues.

Docs - create descriptions/uses for the neural networks
2 participants