Commit a7fe14b

update config to v2
loubbrad committed Dec 4, 2024
1 parent 52b14e7 commit a7fe14b
Showing 6 changed files with 25 additions and 36 deletions.
16 changes: 16 additions & 0 deletions config/accelerate.yaml
@@ -0,0 +1,16 @@
+compute_environment: LOCAL_MACHINE
+debug: false
+distributed_type: 'NO'
+downcast_bf16: 'no'
+gpu_ids: all
+machine_rank: 0
+main_training_function: main
+mixed_precision: bf16
+num_machines: 1
+num_processes: 1
+rdzv_backend: static
+same_network: true
+tpu_env: []
+tpu_use_cluster: false
+tpu_use_sudo: false
+use_cpu: false
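The YAML above is the format produced by `accelerate config` for a single-process, single-GPU run with bf16 mixed precision. As a hedged sketch of how it would be consumed (the entry-point script `train.py` is hypothetical, not named anywhere in this commit):

```shell
# Launch a single-process bf16 training run using the committed config.
# --config_file is a real `accelerate launch` flag; train.py is illustrative.
accelerate launch --config_file config/accelerate.yaml train.py
```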
9 changes: 0 additions & 9 deletions config/models/large-new.json

This file was deleted.

8 changes: 4 additions & 4 deletions config/models/large.json
@@ -1,9 +1,9 @@
 {
-  "d_model": 1024,
-  "n_heads": 16,
-  "n_layers": 64,
+  "d_model": 2048,
+  "n_heads": 32,
+  "n_layers": 16,
   "ff_mult": 4,
   "drop_p": 0.0,
-  "max_seq_len": 4096,
+  "max_seq_len": 8192,
   "grad_checkpoint": true
 }
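The "large" config trades depth for width: half-speed in layers, double in width, with max_seq_len doubled to 8192. A rough back-of-the-envelope check (assuming a standard decoder block of about 4·d² attention parameters plus 2·ff_mult·d² MLP parameters, which is an approximation not stated in the commit) suggests the block parameter count is essentially unchanged:

```python
# Hedged sketch: approximate transformer block parameter count, ignoring
# embeddings, norms, and biases. The 4*d^2 + 2*ff_mult*d^2 per-layer rule
# is a standard estimate, assumed here rather than taken from this repo.
def block_params(d_model: int, n_layers: int, ff_mult: int = 4) -> int:
    per_layer = 4 * d_model**2 + 2 * ff_mult * d_model**2
    return per_layer * n_layers

old = block_params(1024, 64)  # "large" before this commit
new = block_params(2048, 16)  # "large" after this commit
print(old, new)  # both come out to 805306368 (~805M block params)
```

Under that estimate, the commit reshapes the large model (wider and shallower) at roughly constant parameter count while doubling the context length.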
10 changes: 5 additions & 5 deletions config/models/medium.json
@@ -1,9 +1,9 @@
 {
-  "d_model": 768,
-  "n_heads": 12,
-  "n_layers": 48,
+  "d_model": 1536,
+  "n_heads": 24,
+  "n_layers": 16,
   "ff_mult": 4,
   "drop_p": 0.0,
-  "max_seq_len": 4096,
+  "max_seq_len": 8192,
   "grad_checkpoint": true
-}
+}
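These model configs are plain JSON, so they load directly with the standard library. A hedged sketch of reading the post-commit "medium" config and sanity-checking the fields this commit changed (the inline string mirrors config/models/medium.json; the loader shown is illustrative, not this repo's actual API):

```python
# Hedged sketch: parse a model config like config/models/medium.json and
# validate the fields touched by this commit.
import json

MEDIUM_JSON = """\
{
  "d_model": 1536,
  "n_heads": 24,
  "n_layers": 16,
  "ff_mult": 4,
  "drop_p": 0.0,
  "max_seq_len": 8192,
  "grad_checkpoint": true
}
"""

cfg = json.loads(MEDIUM_JSON)
assert cfg["d_model"] % cfg["n_heads"] == 0  # head_dim must divide evenly
head_dim = cfg["d_model"] // cfg["n_heads"]  # 1536 / 24 = 64
print(head_dim, cfg["max_seq_len"])
```

Note that the new medium (d_model 1536, 24 heads) and large (d_model 2048, 32 heads) configs both keep a per-head dimension of 64.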
9 changes: 0 additions & 9 deletions config/models/small.json

This file was deleted.

9 changes: 0 additions & 9 deletions config/models/test.json

This file was deleted.
