- Install the latest patch release of our minimum supported Python version
- Presently our minimum supported version is 3.12
- We recommend you install this with pyenv for convenience of following these setup instructions, but you may install however you wish
- Here are the dev env setup steps using `pyenv`:

    ```shell
    brew install pyenv
    pyenv install 3.12
    ```

    - this installs the latest version of `3.12`

    ```shell
    pyenv global 3.12
    ```

    - this sets the system alias of `python` to version `3.12` for your system - this is not required, but it is convenient for installing poetry on your system

    ```shell
    pyenv local 3.12
    ```

    - this creates a `.python-version` file which poetry will depend upon for version and path resolution when creating your virtual environment
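With pyenv configured, you can sanity-check which interpreter is active straight from Python itself (a generic check, not specific to this repo):

```python
import sys

# Report the interpreter version pyenv has made active
major, minor = sys.version_info[:2]
print(f"Running Python {major}.{minor}")

# The SDK's minimum supported version is 3.12
meets_minimum = (major, minor) >= (3, 12)
```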
- Install poetry, which will be our toolchain for managing all things Python
    - `poetry` will install to a global directory on your system. I recommend you set it using the `POETRY_HOME` env var - but that is not a requirement

    ```shell
    curl -sSL https://install.python-poetry.org | POETRY_HOME=~/.poetry python -
    ```

    (use `python3` vs `python` if needed)

    ```shell
    echo 'export POETRY_HOME=~/.poetry' >> ~/.zshrc
    echo 'export PATH="$POETRY_HOME/bin:$PATH"' >> ~/.zshrc
    source ~/.zshrc
    ```
- If `poetry` has installed correctly then you should be able to run `poetry --version`
- Create the virtual environment and install dependencies with `poetry`:

    ```shell
    poetry install
    ```

    - This creates a virtual env and installs all dependencies
- `poetry` does not require juggling virtual environments; you can simply use the `poetry` commands and it will handle your virtualenv. However, if you need to debug something within the virtualenv, then you can run `poetry shell` to activate it.
- PyCharm has an integration with `poetry` (and other IDEs/editors presumably do too)
The Makefile included in this repo provides a convenient shorthand for calling common poetry commands.
- `make ci`: runs linters, type checking, doc builds, unit tests, etc. - basically everything needed for CI except integration tests, as a quick feedback loop
- `make ci-int`: runs everything that's needed for CI to pass, including integration tests
- `make fmt`: runs black and isort formatters on all Python code
- `make lint`: runs the linter against the `runzero` package to check for CI criteria
- `make lint-all`: runs the linter against all Python code in the repo; beyond what is required for CI
- `make mypy`: runs the mypy type checker against the `runzero` package to check for CI criteria
- `make mypy-all`: runs the mypy type checker against all Python code in the repo; beyond what is required for CI
- `make test`: runs unit tests
- `make test-int`: runs all (unit and integration) tests
- `make docs`: builds the Sphinx docs locally from the SDK
- `make deptry`: runs deptry to analyze deps for issues
- `make tox`: runs all tests under all supported Python envs with tox
- `make tox-ci`: runs unit tests under all supported Python envs with tox by installing the package and executing tests against what is built
- `make tox-ci-int`: runs unit and integration tests under all supported Python envs with tox by installing the package and executing tests against what is built
- `make codegen-models`: runs the pydantic data-model code generator against the API spec
- `make sync-deps`: updates poetry and syncs your current local deps with the current poetry lockfile
- `make init-test-config`: creates a test configuration template locally for overriding integration test configs
- `make hooks`: installs optional local git hooks to keep remote build surprises at bay
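These targets are thin wrappers over the poetry commands described later in this guide. As an illustration only (the repo's actual Makefile is authoritative), a target like `make fmt` typically expands to something like:

```make
# Illustrative sketch - see the real Makefile in this repo for the source of truth
fmt:
	poetry run black ./runzero ./tests
	poetry run isort ./runzero ./tests

mypy:
	poetry run mypy ./runzero
```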
This SDK uses pytest to manage running its unit and integration tests.

The integration tests require configuration to provide a URL and various client secrets. For local development, be sure to run `make init-test-config` and fill in the missing values for your local test instance.

Integration tests can be run using `make test-int` for convenience, or by calling pytest directly via `poetry run pytest -m integration ./tests` if you want to use specific pytest CLI flags.
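Integration tests are selected by the `integration` marker passed to the `-m` flag above; a minimal sketch of what a marked test looks like (the test name and body here are hypothetical):

```python
import pytest

# Hypothetical test: `pytest -m integration` selects tests carrying this marker,
# while unit-test-only runs can deselect them with `-m "not integration"`
@pytest.mark.integration
def test_client_connects():
    assert True
```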
Poetry is an incredibly powerful toolchain and I recommend you read its docs, but the following will serve as a quick guide for onboarding.
- Unlike `pip`, `poetry` has the ability to manage multiple dependency groups so that a published library only includes the dependencies that it needs to run
- To add a dependency, it's as simple as `poetry add {lib}`
- Poetry also offers a number of versioning restrictions, such as:

    ```shell
    poetry add {lib}@^2.0.5    # Allow >=2.0.5, <3.0.0 versions
    poetry add {lib}@~2.0.5    # Allow >=2.0.5, <2.1.0 versions
    poetry add "{lib}>=2.0.5"  # Allow >=2.0.5 versions, without upper bound
    poetry add {lib}==2.0.5    # Allow only 2.0.5 version
    ```
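The caret and tilde shorthands are Poetry notation for plain version ranges. If you want to sanity-check what a constraint admits, the `packaging` library (an assumption on my part - it is not necessarily a dependency of this repo) can express the equivalent specifier sets:

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

caret = SpecifierSet(">=2.0.5,<3.0.0")  # what poetry's ^2.0.5 expands to
tilde = SpecifierSet(">=2.0.5,<2.1.0")  # what poetry's ~2.0.5 expands to

print(Version("2.9.0") in caret)  # True: caret allows minor bumps
print(Version("2.9.0") in tilde)  # False: tilde pins the minor version
```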
- To install a dev-only dependency, you need to use the `-G` flag and declare a dependency group (like `dev`), a la `poetry add -G dev {lib}`
    - Dependency groups can be used for all sorts of things like `test`, `docs`, `lint`, `codegen`, etc.
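In `pyproject.toml`, each group gets its own table. A hypothetical sketch of what `dev` and `docs` groups might look like (the package names and versions here are illustrative, not this repo's actual dependencies):

```toml
[tool.poetry.group.dev.dependencies]
black = "^23.0"
isort = "^5.12"

[tool.poetry.group.docs.dependencies]
sphinx = "^6.0"
```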
- Removing dependencies (and any sub-deps) is very straightforward with `poetry`; you just need to run `poetry remove {lib}`
    - You can also remove dependencies from groups with the `-G` flag, a la `poetry remove -G dev {lib}`
- If you want to synchronize your dependencies with the current lock file (i.e. after switching branches), then it's as simple as running `poetry install --sync`
    - This will remove any deps not found in the lockfile, as well as downgrade/upgrade where appropriate
- `poetry` can manage running your formatters and linters for you as well by leveraging the settings in the `pyproject.toml` file
    - It's as simple as running `poetry run {cmd}`
    - To format with `black`: `poetry run black ./runzero ./tests`
    - To format with `isort`: `poetry run isort ./runzero ./tests`
    - To type check with `mypy`: `poetry run mypy ./runzero`
    - To lint with `pylint`: `poetry run pylint ./runzero`
- Consider auto-linting with the local git hook. Install with `make hooks`
- First we need to install the codegen dependency group:

    ```shell
    poetry install --with codegen
    ```

- Next, we can run the command to generate the pydantic models from the OpenAPI spec:

    ```shell
    poetry run datamodel-codegen --input ./api/runzero-api.yml --field-constraints --output ./models/asset.py --target-python-version 3.8
    ```
To prepare a release, use `./script/prepare_release.sh`. The script:

- Ensures semver conventions by taking a 'major', 'minor' or 'patch' argument, or accepting a forced but standard version string
- Bumps the project version
- Creates a standard release commit
- Creates the annotated release tag with a required committer signature
- Pushes to remote

Use the `-h` flag for help / details. The script will not execute in a dirty repo, and will back out all changes on failure.