
Run wheel builds selectively? #312

Open
jakirkham opened this issue Feb 25, 2022 · 10 comments · May be fixed by #563

Comments

@jakirkham
Member

Given the wheel CI builds take a while, wonder if we could limit how often we run these. For example:

  • Only run if certain files (like `setup.py`) change
  • Run only when requested (like with a PR label)
  • Run only on main and/or tags
  • Test some subset of wheel builds and only test all in specific contexts
  • ?

We need not do all or any of these. They are just some thoughts.

Curious what people think 🙂
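
As a rough sketch of the first and third options (the file name, branch names, and path list here are assumptions, not decisions), a workflow trigger could look something like:

```yaml
# .github/workflows/wheels.yml (hypothetical) -- run wheel builds only
# when packaging-related files change, or on pushes to main / version tags.
name: Wheels

on:
  push:
    branches: [main]
    tags: ["v*"]
  pull_request:
    paths:
      - "setup.py"
      - "pyproject.toml"
      - ".github/workflows/wheels.yml"
```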

@jakirkham
Member Author

cc @henryiii (in case you have thoughts on what others do typically or on ways we might speed up the builds)

@joshmoore
Member

Another option is cron, but for my part I'll add that I'm not great at keeping up with non-PR action failures. (And scheduled workflows get disabled after 60 days of repository inactivity.)

@jakirkham
Member Author

Yeah, and issues can pile up if the builds aren't run regularly.

Just trying to find a balance, since at the moment we're waiting ~2 hours for the Linux wheel build, which is kind of long

@henryiii
Contributor

henryiii commented Feb 25, 2022

I usually only run via workflow_dispatch or on GitHub releases or tags (depending on the project - I like releases, but sometimes tags are needed or better). For main / stable workflows, I'll have wheel builds on stable only. I'll sometimes add a very small subset of runs (one wheel on each OS, for example) to the normal testing workflow. Generally, wheel building is too expensive to serve as your main testing; traditional testing is better, and (full) wheel builds should target releases. For example, see https://github.com/scikit-hep/boost-histogram/blob/339306d388d7f112afb4e6f619c52ab8252c4f6a/.github/workflows/tests.yml#L89-L114.

And the full build is at https://github.com/scikit-hep/boost-histogram/blob/339306d388d7f112afb4e6f619c52ab8252c4f6a/.github/workflows/wheels.yml
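
For reference, a minimal version of that trigger setup (manual dispatch plus releases) might look like this; the event choices are a sketch, not the linked project's exact configuration:

```yaml
on:
  workflow_dispatch:    # run on demand from the Actions tab
  release:
    types: [published]  # full wheel builds target releases
```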

@henryiii
Contributor

(this is also another reason to use configuration like https://github.com/scikit-hep/boost-histogram/blob/339306d388d7f112afb4e6f619c52ab8252c4f6a/pyproject.toml#L74-L86 instead of putting everything in environment variables)
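
A minimal sketch of that configuration style (the selectors and test settings here are illustrative, not the linked project's actual values):

```toml
# pyproject.toml -- cibuildwheel settings live alongside the project
# metadata instead of in CIBW_* environment variables in the workflow.
[tool.cibuildwheel]
test-command = "pytest {project}/tests"
test-extras = ["test"]

[tool.cibuildwheel.linux]
archs = ["x86_64"]
```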

@jakirkham
Member Author

Thanks for the pointers here Henry! 😄

Looking more at the builds here, it seems we are covering other architectures that probably require emulation (like ppc64le and aarch64) in CI, 32-bit builds (I don't know that we have tested there, on CI or otherwise, outside of the wheel builds), as well as other libc implementations (like musl).

Given that the first two likely require emulation, that we haven't actively tested on 32-bit, and that there probably isn't much in Numcodecs that would be sensitive to the libc implementation, I'm wondering if we could restrict regular builds to x86_64 with glibc and only run the others when releasing. If we want additional coverage for things like ppc64le and aarch64, I'd suggest adding jobs for them outside of the wheel builds (ideally without emulation, for better overall CI runtime).

Thoughts? 🙂
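
One way to express that restriction with cibuildwheel (the selector pattern is a sketch) would be a build filter on the regular CI job only:

```yaml
# Regular CI wheel job: build x86_64 glibc wheels only.
env:
  CIBW_BUILD: "*-manylinux_x86_64"
# The release job would leave CIBW_BUILD unset, so all configured
# platforms (aarch64, ppc64le, musllinux, ...) still build for tags.
```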

@jakirkham
Member Author

One way to provide optionality here would be to allow labeling of PRs, like cloudpickle does ( cloudpipe/cloudpickle#339 ). This would allow running additional tests when relevant.
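
In GitHub Actions terms, that kind of label gating can be done with a job-level condition (the label name here is hypothetical):

```yaml
on:
  pull_request:
    # include "labeled" so adding the label re-triggers the workflow
    types: [opened, synchronize, labeled]

jobs:
  wheels:
    # Only run when the PR carries the (hypothetical) "ci wheels" label.
    if: contains(github.event.pull_request.labels.*.name, 'ci wheels')
    runs-on: ubuntu-latest
```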

@jakirkham
Member Author

Going to start trimming the build matrix. Submitted PR ( #320 ) to do that. Will evaluate based on that whether more trimming or other changes are needed.

@jakirkham
Member Author

A path check like this one in Dask would also work
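
Path checks are supported natively via `paths` filters on the triggering event; a sketch (the paths listed are assumptions about what matters for wheel builds here):

```yaml
on:
  pull_request:
    paths:
      - "setup.py"
      - "numcodecs/**"
```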

@dstansby dstansby linked a pull request Aug 23, 2024 that will close this issue
@dstansby
Contributor

I've opened a possible fix to this at #563
