This repository has been archived by the owner on Jun 14, 2024. It is now read-only.
forked from Ciela-Institute/caustics
Commit
This commit does not belong to any branch on this repository, and may belong to a fork outside of the repository.
test: Update tests to start using GPU device when available and skip CPU (Ciela-Institute#164)

* test: Start to update testing to use cpu and cuda
* test: Add 'device' fixture with values based on cuda availability
* fix: Add self return for FlatLambdaCDM.to method
* test: Update test_base to use 'device'
* test: Update test_batching to use 'device'
* test: Update test_cosmology to use 'device'
* test: Update test_epl to use 'device'
* test: Update test_external_shear to use 'device'
* test: Update test_interpolate_image to use 'device'
* test: Update tests_jacobian_lens_equation to use 'device'
* test: Move tensors to CPU first before converting to np.array. Moves tensors '.to("cpu")' explicitly before '.numpy()' to avoid a 'TypeError' from PyTorch: "TypeError: can't convert cuda:0 device type tensor to numpy. Use Tensor.cpu() to copy the tensor to host memory first."
* fix: Fix base forward_raytrace with guesses not in same device
* feat: Added 'core' module with 'sync_device' decorator. Added 'core.py' module and a 'sync_device' decorator that can be used on any function to sync the device of the input tensors to the first non-cpu device it finds.
* test: Update test_comoving_dist to cast to numpy
* test: Add device to the rest of applicable tests
* test: Update to use missing device
* test: Set device on plane params
* revert: Remove 'sync_device' decorator
* fix: Fix device in batch_lm
* test: Update .to('cpu') to .cpu() from code review
* fix: Update tests/test_multiplane.py

Co-authored-by: Connor Stone <[email protected]>
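The "Move tensors to CPU first" item addresses a standard PyTorch constraint: `.numpy()` only works on CPU tensors, so a tensor living on a CUDA device must be copied to host memory first. A minimal sketch of the pattern (the variable names here are illustrative, not from the commit):

```python
import torch

# Pick CUDA when available, mirroring the commit's device handling.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

x = torch.linspace(0.0, 1.0, 5, device=device)

# On a CUDA tensor, x.numpy() raises:
#   TypeError: can't convert cuda:0 device type tensor to numpy.
#   Use Tensor.cpu() to copy the tensor to host memory first.
# Hopping through .cpu() works on any device:
x_np = x.cpu().numpy()

print(x_np.shape)  # (5,)
```

The code review noted in the commit preferred the shorthand `.cpu()` over the equivalent `.to("cpu")`.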
Commit f2f02ce (1 parent: 0fdb2c9). Showing 25 changed files with 321 additions and 190 deletions.
The new test configuration (hunk `@@ -1,5 +1,24 @@`) adds a parametrized `device` fixture:

```python
import sys
import os
import torch
import pytest

# Add the helpers directory to the path so we can import the helpers
sys.path.append(os.path.join(os.path.dirname(__file__), "utils"))

CUDA_AVAILABLE = torch.cuda.is_available()


@pytest.fixture(
    params=[
        pytest.param(
            "cpu", marks=pytest.mark.skipif(CUDA_AVAILABLE, reason="CUDA available")
        ),
        pytest.param(
            "cuda",
            marks=pytest.mark.skipif(not CUDA_AVAILABLE, reason="CUDA not available"),
        ),
    ]
)
def device(request):
    return torch.device(request.param)
```