Storage of (large) test data #277
Currently, some data sets used in tests are created with `inst/generate_examples.R` and then stored in `inst/extdata`. At the moment these are `fit.rds` and `fit_gamma.rds`. These files are above the 50 MB that GitHub recommends as a maximum file size.

One option is to thin these fits down to make them smaller. Another option is to decide that this is the wrong approach and rethink where we store data for tests and how we are approaching this.
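For illustration, a minimal sketch of the save-time side of this, assuming the fits already exist in memory as `fit` and `fit_gamma`; the `xz` compression choice and the 50 MB check are assumptions, not what `inst/generate_examples.R` currently does:

```r
# Sketch: save the example fits with stronger compression and warn about
# anything over GitHub's recommended 50 MB limit.
saveRDS(fit, "inst/extdata/fit.rds", compress = "xz")
saveRDS(fit_gamma, "inst/extdata/fit_gamma.rds", compress = "xz")

for (f in c("inst/extdata/fit.rds", "inst/extdata/fit_gamma.rds")) {
  size_mb <- file.size(f) / 1024^2
  if (size_mb > 50) {
    warning(sprintf("%s is %.1f MB, above GitHub's recommended 50 MB", f, size_mb))
  }
}
```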
Comments

In the first instance, thin down. And yes, it's definitely a "wrong" approach, but I think it works for now. We should have an issue to explore alternatives. IMO I can't see a reason these fits would need to be this big.
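As a sketch of what thinning could look like, assuming the stored object is, or can be converted to, a `posterior` draws object; the thinning factor of 10 is arbitrary:

```r
# Sketch: shrink a stored fit by keeping every 10th posterior draw.
library(posterior)

# Assumption: the saved object can be converted to a draws array; if the
# fits are raw model objects, an intermediate extraction step is needed.
draws <- as_draws_array(readRDS("inst/extdata/fit.rds"))
thinned <- thin_draws(draws, thin = 10)

# Note: this stores draws rather than the full fit object, so any test
# that needs the full fit would have to be adapted.
saveRDS(thinned, "inst/extdata/fit.rds", compress = "xz")
```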
I'm happy with closing this issue on thinning, then adding another issue for alternatives.
I just looked, and it seems like we could largely replace this approach by running a model fit in `setup.R`?
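Something like the following, where `fit_example_model()` and `example_data` are hypothetical stand-ins for whatever the package actually exposes:

```r
# tests/testthat/setup.R
# Sketch: fit the example models once per test run instead of shipping
# large .rds files; objects created here are visible to all test files.
example_fit <- fit_example_model(example_data, cores = 2)
example_fit_gamma <- fit_example_model(example_data, family = "gamma", cores = 2)
```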
Note that
The downside is that when I am doing things locally I run
But surely any fit we need for a test is going to take less than 30 seconds? There really doesn't seem to be a need for more than that.
The Gamma one is not under 30 seconds currently. And if we are intending to do "parameter recovery" integration tests, then they can't be bad fits.
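For context, a parameter recovery test would look roughly like this; `simulate_example_data()`, `fit_example_model()`, and `coef_of()` are hypothetical stand-ins, and the 20% tolerance is arbitrary:

```r
# Sketch: simulate data with known parameters, refit, and check the
# estimates land near the truth. A poorly converged fit would fail this.
test_that("gamma model recovers simulation parameters", {
  true_shape <- 2
  dat <- simulate_example_data(shape = true_shape, n = 500)
  fit <- fit_example_model(dat, family = "gamma", cores = 2)
  est <- coef_of(fit)  # hypothetical accessor for point estimates
  expect_lt(abs(est["shape"] - true_shape) / true_shape, 0.2)
})
```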
That doesn't seem ideal; it feels like the model should be workable within 30 seconds per core, so I find this surprising.
I might not have been setting cores as an argument. I can post the actual runtimes here.
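A sketch of how that could be measured, using the same hypothetical `fit_example_model()` as above and assuming a `cores` argument that parallelises chains:

```r
# Sketch: record the wall-clock time of the Gamma fit with chains
# running in parallel, to check whether it fits within ~30 seconds.
elapsed <- system.time(
  fit_example_model(example_data, family = "gamma", cores = 4)
)["elapsed"]
message(sprintf("Gamma fit took %.1f seconds", elapsed))
```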