Hey - for context, @ClementineDomine asked me to have a brief look at the NeuralPlayground code base and possibly give you some general (mostly packaging/docs) improvement pointers.
I read the `README.md` and had a look at the file architecture, and I have some suggestions. Not all of them need to be addressed urgently, but it would be nice to solve them in the long term.
It is very nonstandard to have a `.egg-info` folder committed as part of a git repository. This folder is the result of build commands, which typically do not interact with the source code of a repository, but are rather integrated into separate package-publishing pipelines (CI scripts that publish to PyPI, for instance, or conda recipes). Unless this folder is regenerated at every commit that changes the structure of the repository, the `.egg-info` will drift out of sync with the current state of the repository, which can be a source of downstream bugs for users and developers.
The same comments hold for the `build/lib/neuralplayground` and `dist/` subfolders, which are also the output of build commands and are traditionally kept out of source-code mirrors such as git repositories (see the `.gitignore` sketch below). Source distributions can easily be recreated by `git checkout`-ing the relevant tag/branch and generating the source distribution from there.
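As a sketch, the entries below would keep these generated folders out of version control; the paths are taken from the folders mentioned above and may need adjusting to the actual layout:

```gitignore
# build artifacts generated by setuptools; they can always be regenerated,
# so there is no need to track them in git
build/
dist/
*.egg-info/
```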
Related to the two previous comments: in the setup instructions, you don't need the line `python setup.py sdist bdist_wheel` in order to get a working editable install of NeuralPlayground. That line creates a source distribution and a wheel file, which are typically needed when publishing a package to a package index like PyPI, but not as part of the code repository (which already plays the role of a source distribution, and which is not the right vessel to carry wheels). `python -m pip install -e .` is enough in your case. As a consequence, the line `pip install --upgrade setuptools wheel` can also go away.
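Concretely, the install section could be reduced to something like this (a sketch, assuming the command is run from the root of a NeuralPlayground checkout):

```sh
# an editable install only needs pip; no sdist/wheel step required
python -m pip install -e .
```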
I'm seeing two different requirements files: `requirements.txt` and `requirements.yml`. The `requirements.yml` file is not used anymore (it seems like it was removed here). Two suggested options: either remove it, or, better IMO, keep relying on it. conda is much more powerful than pip, as it is also able to install arbitrary binary dependencies, not just Python packages. Installing a Python package using conda often limits the risk of dependency errors and facilitates the build process, to the point that many large-scale Python packages strongly recommend using conda rather than pip (see [1], [2], [3] for instance). A conda-based workflow is sketched below.
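If you go the conda route, the setup could look roughly like this; this is a sketch that assumes `requirements.yml` is a conda environment file, and the environment name is a placeholder:

```sh
# create and activate a conda environment from the requirements file,
# then install NeuralPlayground itself in editable mode
conda env create -f requirements.yml
conda activate neuralplayground   # placeholder name; use the "name:" field of the yml file
python -m pip install -e .
```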
I noticed that NeuralPlayground's `requirements.txt` file specifies very narrow version constraints for most of its dependencies (e.g. it pins most dependency versions down to the bugfix release). While such a practice is nice from a reproducibility standpoint, hardcoding a version number for every dependency of the project drastically increases the complexity of dependency solving, which is already a very hard problem [4]. For instance, what if some user uses this project alongside other libraries that have incompatible requirements? Ideally, you could relax these constraints to the bare minimum needed to ensure that NeuralPlayground is easy to integrate as part of a larger virtual environment.
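To illustrate the kind of relaxation I mean — the package names and versions below are purely hypothetical, not taken from the actual file:

```text
# before: pinned down to the bugfix release, hard to co-install with other packages
numpy==1.23.4
matplotlib==3.6.2

# after: only the constraints NeuralPlayground actually needs
numpy>=1.21
matplotlib
```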
There's quite a lot of documentation scattered around the codebase (either as part of `README.md` files within subfolders, or as docstrings). One nice thing NeuralPlayground could do would be to gather these pieces of documentation into a website, hosted on GitHub Pages and autogenerated by Sphinx.
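As a rough starting point (the `docs/` layout is just a suggestion):

```sh
# bootstrap a Sphinx skeleton under docs/ and build the HTML locally;
# the generated docs/_build/html can then be published to GitHub Pages,
# e.g. from a gh-pages branch or a GitHub Actions workflow
pip install sphinx
sphinx-quickstart docs
sphinx-build -b html docs docs/_build/html
```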