
Commit

No debug
tgaddair committed Apr 2, 2024
1 parent 74d002e commit 5a8687a
Showing 1 changed file with 0 additions and 1 deletion.
1 change: 0 additions & 1 deletion server/lorax_server/models/flash_causal_lm.py
@@ -271,7 +271,6 @@ def from_pb(
             max_length = max(max_length, input_length + max_new_tokens)

         adapter_indices = torch.cat(adapter_indices_list).to(dtype=torch.int64, device=device)
-        print("!!! ADAPTER INDICES", adapter_indices)

         request_tokenizers = [
             tokenizers.get_tokenizer(r.adapter_index, tokenizer)
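
The removed line was a leftover debug print of the concatenated adapter index tensor. As context for what that tensor holds, here is a minimal sketch (names and structure assumed for illustration, not taken from the LoRAX source): each request in a batch contributes its adapter index repeated once per input token, and the per-request chunks are concatenated into one flat index sequence, analogous to the `torch.cat(adapter_indices_list)` call shown in the diff. Plain Python lists stand in for tensors here.

```python
# Hypothetical sketch: build a flat per-token adapter index list for a batch.
# Each request is a (adapter_index, input_length) pair; the real code builds
# one 1-D tensor per request and concatenates them with torch.cat.
def concat_adapter_indices(requests):
    # One chunk per request: the adapter index repeated input_length times.
    adapter_indices_list = [
        [adapter_index] * input_length
        for adapter_index, input_length in requests
    ]
    # Flatten the chunks, mirroring torch.cat over a list of 1-D tensors.
    return [idx for chunk in adapter_indices_list for idx in chunk]


# Two requests: adapter 0 with 3 input tokens, adapter 1 with 2 input tokens.
print(concat_adapter_indices([(0, 3), (1, 2)]))  # → [0, 0, 0, 1, 1]
```

With the debug print gone, this mapping is still computed the same way; it is simply no longer echoed to stdout on every batch.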
