Issues: pytorch/torchrec
Passing Device as str in EmbeddingBagCollectionSharder().shard (#2624, opened Dec 10, 2024 by ArijitSinghEDA)
[Bug] state_dict returns wrong path when DMP is used as a submodule (#2584, opened Nov 22, 2024 by JacoCheung)
ShardedQuantEmbeddingBagCollection doesn't seem to be distributing the shards properly (#2575, opened Nov 21, 2024 by Hanyu-Li)
[Question/Bug] DP sharding parameters are inconsistent with others (#2563, opened Nov 18, 2024 by JacoCheung)
[Question] Does TorchRec support distributed checkpointing (DCP)? (#2534, opened Nov 4, 2024 by JacoCheung)
RuntimeError: CUDA error: no kernel image is available for execution on the device (#2496, opened Oct 21, 2024 by uncle-sann)
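For the "no kernel image is available" error above, a minimal diagnostic sketch (an assumption about the usual cause, not taken from the issue itself): it checks whether the compute capability of the local GPU is in the arch list the installed torch wheel was built for. If it is missing, torch (and fbgemm_gpu, which TorchRec depends on) has no kernels for that GPU.

```python
import torch

# Print the CUDA architectures the installed torch wheel was compiled for.
print("torch:", torch.__version__, "CUDA:", torch.version.cuda)
print("compiled arch list:", torch.cuda.get_arch_list())

if torch.cuda.is_available():
    # Compute capability of the first visible GPU, e.g. (8, 0) -> sm_80.
    major, minor = torch.cuda.get_device_capability(0)
    print(f"device 0 capability: sm_{major}{minor}")
    # If sm_{major}{minor} is not in the arch list above, the wheel lacks
    # kernels for this GPU and will raise "no kernel image is available".
```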
[Question] Is there gradient accumulation support for training? (#2332, opened Aug 22, 2024 by liuslnlp)
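For the gradient-accumulation question above, a minimal sketch of the standard plain-PyTorch pattern (not a TorchRec API): scale each micro-batch loss, call backward() several times, and step the optimizer once. Note that TorchRec's fused embedding optimizers apply their update during backward, so this pattern only covers dense parameters; it is an illustration, not a confirmed TorchRec feature.

```python
import torch

accum_steps = 4  # hypothetical number of micro-batches per optimizer step
model = torch.nn.Linear(16, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = torch.nn.MSELoss()

for step in range(100):
    x, y = torch.randn(8, 16), torch.randn(8, 1)
    loss = loss_fn(model(x), y) / accum_steps  # scale so accumulated grads average
    loss.backward()                            # gradients accumulate in .grad
    if (step + 1) % accum_steps == 0:
        optimizer.step()
        optimizer.zero_grad()
```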
How to share embeddings between an EmbeddingCollection and an EmbeddingBagCollection? (#2268, opened Aug 2, 2024 by tiankongdeguiji)
Does not work when using DATA_PARALLEL with FusedEmbeddingBagCollection (#2209, opened Jul 4, 2024 by imh966)
[Bug][Dynamic Embedding] Improper optimizer state_dict momentum2 key while constructing PSCollection (#2177, opened Jun 26, 2024 by JacoCheung)