Run document update again (#1216)
* misc changes to neox_args

* Update NeoXArgs docs automatically

---------

Co-authored-by: github-actions <[email protected]>
jahatef and github-actions authored May 16, 2024
1 parent 49cd41f commit d037756
Showing 1 changed file with 44 additions and 3 deletions.
47 changes: 44 additions & 3 deletions configs/neox_arguments.md
@@ -111,7 +111,7 @@ Logging Arguments

- **git_hash**: str

-Default = 6fb840e
+Default = 8d175ed

current git hash of repository

@@ -1201,7 +1201,7 @@ Text Generation arguments
-- **num_experts**: int
+- **moe_num_experts**: int

Default = 1
@@ -1243,7 +1243,7 @@ Text Generation arguments
- **moe_token_dropping**: bool

-Default = True
+Default = False

Whether to drop tokens when exceeding capacity
@@ -1273,6 +1273,47 @@ Text Generation arguments
- **moe_type**: str

Default = megablocks

Either `deepspeed` or `megablocks`

- **moe_glu**: bool

Default = False

Use gated linear units in MoE

- **moe_lbl_in_fp32**: bool

Default = False

Whether to compute the load balancing loss in fp32.

- **moe_jitter_eps**: float

Default = None

Coefficient for MoE routing jitter. Jitter is not used if set to None.

- **enable_expert_tensor_parallelism**: bool

Default = False

Enable expert tensor parallelism
## NeoXArgsTokenizer
Tokenizer Arguments
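Taken together, the MoE arguments documented in the hunks above correspond to keys in a GPT-NeoX YAML config. Below is a minimal, illustrative sketch: the key names are taken from this diff, while the non-default values (for example `moe_num_experts: 8` and `moe_jitter_eps: 0.01`) are assumptions chosen for demonstration, not settings prescribed by the commit.

```yaml
# Illustrative MoE settings for a GPT-NeoX YAML config.
# Key names follow the arguments documented in the diff above;
# non-default values are example choices, not recommendations.
moe_type: "megablocks"                    # default; the alternative is "deepspeed"
moe_num_experts: 8                        # default is 1; 8 is an illustrative override
moe_token_dropping: false                 # drop tokens when exceeding capacity; default False per this commit
moe_glu: false                            # use gated linear units in the MoE layers (default False)
moe_lbl_in_fp32: false                    # compute the load-balancing loss in fp32 (default False)
moe_jitter_eps: 0.01                      # routing jitter coefficient; the default None disables jitter (illustrative value)
enable_expert_tensor_parallelism: false   # expert tensor parallelism (default False)
```

Here `moe_type` selects between the `deepspeed` and `megablocks` implementations; the other keys are shown at the defaults listed above except where the comments note an illustrative override.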
