Commit 885cae4
Adibvafa committed Sep 21, 2024 (2 parents: 6b4e3cb + c452219)
Showing 1 changed file with 12 additions and 1 deletion: README.md
@@ -61,6 +61,7 @@
```python
output = predict_dna_sequence(
tokenizer=tokenizer,
model=model,
attention_type="original_full",
deterministic=True
)
print(format_model_output(output))
```
@@ -86,13 +87,23 @@
```
M_UNK A_UNK L_UNK W_UNK M_UNK R_UNK L_UNK L_UNK P_UNK L_UNK L_UNK A_UNK L_UNK L_
-----------------------------
ATGGCTTTATGGATGCGTCTGCTGCCGCTGCTGGCGCTGCTGGCGCTGTGGGGCCCGGACCCGGCGGCGGCGTTTGTGAATCAGCACCTGTGCGGCAGCCACCTGGTGGAAGCGCTGTATCTGGTGTGCGGTGAGCGCGGCTTCTTCTACACGCCCAAAACCCGCCGCGAAGCGGAAGATCTGCAGGTGGGCCAGGTGGAGCTGGGCGGCTAA
```

### Generating Multiple Variable Sequences

Set `deterministic=False` to generate variable sequences. Control the variability using `temperature`:

- `temperature` (recommended between 0.2 and 0.8):
- Lower values (e.g., 0.2): More conservative predictions
- Higher values (e.g., 0.8): More diverse predictions

Using high temperatures may produce DNA sequences that do not translate to the input protein. <br>
Generate multiple sequences by setting `num_sequences` to a value greater than 1, as in the sketch below.
<br>
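As a minimal sketch (assuming the same `tokenizer` and `model` setup as the example above; `temperature` and `num_sequences` are the keywords described in this section, and the exact shape of the returned predictions is not shown here), sampling several candidate sequences might look like:

```python
# Sketch: variable (sampled) predictions with several candidate sequences.
# tokenizer and model are assumed to be set up as in the example above.
output = predict_dna_sequence(
    tokenizer=tokenizer,
    model=model,
    attention_type="original_full",
    deterministic=False,  # enable variable, sampled predictions
    temperature=0.5,      # recommended range 0.2-0.8; lower is more conservative
    num_sequences=3,      # request more than one candidate DNA sequence
)
```

Lower temperatures keep the candidates closer to the deterministic prediction; at higher temperatures, verify that each returned sequence still translates to the input protein.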

**You can use the [inference template](https://github.com/Adibvafa/CodonTransformer/raw/main/src/CodonTransformer_inference_template.xlsx) for batch inference in [Google Colab](https://adibvafa.github.io/CodonTransformer/GoogleColab).**

<br>


## Installation
Install CodonTransformer via pip:
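As a sketch, assuming the package is published on PyPI under the project name:

```bash
pip install CodonTransformer
```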
