[bug?] [question] Confused about the params size of hash grid #468

Open
JackFishxxx opened this issue Sep 4, 2024 · 0 comments
Thank you for the great work on this project! I am currently using the code and have some questions regarding the storage size calculation of the hash grid.

I noticed some discrepancies between the expected and actual storage sizes. Could you please clarify how the storage size of the hash grid is calculated in your implementation? Specifically:

>>> import torch
>>> import tinycudann as tcnn
>>> hash_grid_config = {
...     "otype": "Grid",
...     "type": "Dense",
...     "n_levels": 8,
...     "n_features_per_level": 8,
...     "base_resolution": 2,
... }
>>> hash_grid = tcnn.Encoding(
...     n_input_dims=2,
...     encoding_config=hash_grid_config,
... )
>>> hash_grid.state_dict()["params"].shape
torch.Size([699072])

The parameter count is as expected here: (256**2 + 128**2 + 64**2 + 32**2 + 16**2 + 8**2 + 4**2 + 2**2) * 8 = 699040. Since the coarsest level's 2**2 = 4 grid entries are aligned (padded) to 8, the final parameter count becomes (256**2 + 128**2 + 64**2 + 32**2 + 16**2 + 8**2 + 4**2 + 8) * 8 = 699072.
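For reference, here is a minimal sketch of the counting rule assumed above (the helper expected_dense_grid_params is hypothetical and not part of tcnn; it assumes the default per_level_scale of 2.0 and that each level's number of grid entries is padded up to a multiple of 8):

def expected_dense_grid_params(n_levels, n_features_per_level, base_resolution,
                               n_input_dims=2, per_level_scale=2.0, alignment=8):
    # Sum the (padded) number of grid entries over all levels, then multiply
    # by the number of features stored per entry.
    total_entries = 0
    for level in range(n_levels):
        resolution = int(round(base_resolution * per_level_scale ** level))
        entries = resolution ** n_input_dims                           # dense grid: resolution^dims entries
        entries = (entries + alignment - 1) // alignment * alignment   # pad to a multiple of 8
        total_entries += entries
    return total_entries * n_features_per_level

print(expected_dense_grid_params(n_levels=8, n_features_per_level=8, base_resolution=2))
# 699072, matching the reported shape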

Another case is as follows:

>>> hash_grid_config = {
...     "otype": "Grid",
...     "type": "Dense",
...     "n_levels": 1,
...     "n_features_per_level": 8,
...     "base_resolution": 1,
... }
>>> hash_grid = tcnn.Encoding(
...     n_input_dims=2,
...     encoding_config=hash_grid_config,
... )
>>> hash_grid.state_dict()["params"].shape
torch.Size([64])

The parameter count is 1**2 * 8 = 8, but the number of grid entries is aligned to 8, so the final parameter count is 8 * 8 = 64.
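The same hypothetical helper from the first case reproduces this number as well:

print(expected_dense_grid_params(n_levels=1, n_features_per_level=8, base_resolution=1))
# 64, matching the reported shape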

However, the following case seems to be incorrect:

>>> hash_grid_config = {
...     "otype": "Grid",
...     "type": "Dense",
...     "n_levels": 2,
...     "n_features_per_level": 8,
...     "base_resolution": 2,
... }
>>> hash_grid = tcnn.Encoding(
...     n_input_dims=2,
...     encoding_config=hash_grid_config,
... )
>>> hash_grid.state_dict()["params"].shape
torch.Size([524352])

The expected number of parameters is (2**2 + 4**2) * 8 = 160, and with the coarsest level's 2**2 = 4 entries aligned to 8, the final count should be (8 + 4**2) * 8 = 192, which differs from the actual result of 524352.
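Using the same hypothetical helper, the count I would expect for this configuration is:

print(expected_dense_grid_params(n_levels=2, n_features_per_level=8, base_resolution=2))
# 192, whereas the encoding reports 524352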

This really confuses me. Am I missing something? I appreciate any insight you can provide. Thanks in advance for your help!
