Hv check test suite #120

Merged
merged 41 commits into from
Jul 15, 2024

Conversation

HansVRP
Collaborator

@HansVRP HansVRP commented Jun 10, 2024

I have included a unit test for the patch extractions.

With the integration tests disabled, the current suite runs in 22 s.

@HansVRP
Collaborator Author

HansVRP commented Jun 10, 2024

@GriffinBabe @VictorVerhaert

Please take a look and let me know what you think.

The changes are in test_unit_patch_extractors.

Collaborator

@VictorVerhaert VictorVerhaert left a comment


Adding skip to all the current integration tests seems dangerous to me, as development of new features might not be tested properly while we are writing the unit tests for this functionality.
Perhaps we can "simply" replace the execution on openeo with a check on the process graph (PG), or even just do a pre-flight check contained in the Python client, so we don't fully lose these tests.
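As a minimal sketch of what such a process-graph check could look like: `build_patch_extraction_cube()` is a hypothetical stand-in for the code that assembles the cube, and beyond that only `DataCube.flat_graph()` and a stored reference JSON are assumed.

```python
# Sketch only: compare the generated process graph with a stored reference
# instead of executing the job on the openEO backend.
import json
from pathlib import Path


def test_patch_extraction_process_graph():
    # build_patch_extraction_cube() is a hypothetical stand-in for the code
    # under test that assembles the openeo DataCube.
    cube = build_patch_extraction_cube()

    # flat_graph() serialises the client-side process graph without executing it.
    actual_pg = cube.flat_graph()

    expected_pg = json.loads(Path("tests/resources/patch_extraction_pg.json").read_text())
    assert actual_pg == expected_pg
```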

Furthermore, it is not entirely clear to me what the tests in test_unit_patch_extractors are supposed to test. In general, test files should follow the convention of test_<file_to_be_tested>.py so unit tests clearly point to the code being tested.
In this case the file would be test_feature_extractor.py, with perhaps a class combining the tests for the patch extractors.



# Helper functions to test execute
def create_mock_common_preparations():
Collaborator

This seems to be a constant, perhaps it is easier to create it as a constant; or even better as a pytest fixture?

Collaborator Author

Will set it as a fixture; it does need to be a function.
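A rough sketch of that change; the array shape and fill values are placeholders, not the real mock data:

```python
# Sketch: the mock helper rewritten as a pytest fixture (placeholder data).
import numpy as np
import pytest
import xarray as xr


@pytest.fixture
def mock_common_preparations():
    # Placeholder shape; the real fixture would mirror the bands/t/y/x layout
    # used by the patch extractor tests.
    data = np.zeros((3, 2, 4, 4))
    return xr.DataArray(data, dims=["bands", "t", "y", "x"])
```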

return xr.DataArray(data, dims=["bands", "t", "y", "x"])


def create_mock_rescale_s1():
Collaborator

This seems to be a constant, perhaps it is easier to create it as a constant; or even better as a pytest fixture?

Collaborator Author

Will set it as a fixture; it does need to be a function.



# test execute
@patch.object(
Collaborator

If I understand correctly, patch.object is supposed to mock the output of one function, in case you don't want to mock an entire object. Here you are mocking a function of a dummy object. I think it is more clear to just adjust your dummy object.
Source: https://realpython.com/python-mock-library/#patching-an-objects-attributes

Collaborator Author

will take a look

Collaborator Author

For this test I mock the actual functions in the actual PatchExtractor.

However, by defining these in the DummyPatchExtractor, you would no longer be able to test the behavior of the real functionality through the Dummy class.
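A minimal sketch of that setup; the import path and class name are assumptions based on this thread, not taken from the repository, and the mock helpers are the ones defined earlier in the test file:

```python
# Sketch: patch the real extractor methods, then drive execute() through the
# dummy subclass so the real execute() logic is still exercised.
from unittest.mock import patch

# Assumed import path and class name.
from openeo_gfmap.features.feature_extractor import PatchFeatureExtractor


@patch.object(
    PatchFeatureExtractor,
    "_common_preparations",
    return_value=create_mock_common_preparations(),
)
@patch.object(
    PatchFeatureExtractor,
    "_rescale_s1_backscatter",
    return_value=create_mock_rescale_s1(),
)
def test_execute(mock_rescale_s1, mock_common_preparations):
    ...
```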

"_rescale_s1_backscatter",
return_value=create_mock_rescale_s1(),
)
def test_execute(mock_common_preparations, mock_rescale_s1):
Collaborator

What does this test test, exactly? I don't see any gfmap functionality being used.

Collaborator Author

It tests whether execute actually calls common_preparations and rescale_s1.

It is similar to testing whether a process graph remained the same.
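Concretely, the body of such a test could assert those calls directly; DummyPatchExtractor and the execute() call signature are assumed from this thread:

```python
# Sketch of the assertions: execute() on the dummy extractor should route
# through the (mocked) real helper methods exactly once each.
def test_execute(mock_rescale_s1, mock_common_preparations):
    extractor = DummyPatchExtractor()  # test double referenced in this thread
    extractor.execute(create_mock_common_preparations())  # input shape assumed

    mock_common_preparations.assert_called_once()
    mock_rescale_s1.assert_called_once()
```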

@HansVRP
Collaborator Author

HansVRP commented Jun 20, 2024

test_feature_extractor already exists (these are currently integration tests).

For now I will opt to add "unit" to the name so that we can split them up accordingly later on.

Collaborator

@VictorVerhaert VictorVerhaert left a comment


This PR adds a lot of TODOs to the code. I think these should be added to (multiple) issues so we don't lose track of them, can plan them, and prevent our code from becoming a TODO forest.

temp.nc Outdated
Collaborator

Should this file be committed?

Collaborator Author

probably not

tests/test_openeo_gfmap/test_s1_fetchers.py Outdated (resolved)
@HansVRP HansVRP merged commit aa22537 into main Jul 15, 2024
2 checks passed