[WIP] πŸš€πŸš€πŸš€ Transformers.js V3 πŸš€πŸš€πŸš€ #545

Status: Open. This pull request wants to merge 447 commits into base branch `main`.

Commits (447):
939f735
Formatting
xenova Jun 10, 2024
edfa94f
Optimize `spectogram` function
xenova Jun 10, 2024
490a4b8
Add rfft/stft tensor ops via ORT
xenova Jun 11, 2024
39bd0aa
Optimize top_k sampling
xenova Jun 11, 2024
4d2c819
Allow priority queue to specify max heap size
xenova Jun 11, 2024
49ace06
Create tiny random pipeline test (fill-mask)
xenova Jun 11, 2024
9ac56c8
Create text-classification pipeline test
xenova Jun 11, 2024
3eaa484
Add `problem_type = 'multi_label_classification'` test
xenova Jun 11, 2024
f45cdd8
Support top_k w/ `k < dims.at(-1)`
xenova Jun 12, 2024
a018842
Create image-classification tiny pipeline test
xenova Jun 12, 2024
4db5afd
Create audio-classification tiny pipeline test
xenova Jun 12, 2024
fe931f1
Add support for cohere models
xenova Jun 12, 2024
9ee75bd
Add `CohereModel` unit test
xenova Jun 12, 2024
3ecfcc1
Add token-classification pipeline test
xenova Jun 12, 2024
222669f
Add question-answering pipeline test
xenova Jun 12, 2024
e0a7a4d
(temp) Ignore beam search
xenova Jun 13, 2024
2a3a71b
Implement `ForcedEOSTokenLogitsProcessor`
xenova Jun 13, 2024
6006fe7
Update pipelines
xenova Jun 14, 2024
168588a
Create webgpu depth estimation demo
xenova Jun 14, 2024
fc53d15
Update package-lock.json
xenova Jun 18, 2024
db7c1f3
Add support for `MobileNet` (v1/v2/v3/v4)
xenova Jun 18, 2024
63cdcb2
Fix vision-encoder-decoder configs
xenova Jun 20, 2024
c32476c
Use switch statement for model type checks
xenova Jun 20, 2024
6a886f4
Re-enable text-to-audio pipeline
xenova Jun 21, 2024
fd38113
Support fallback for not-found errors
xenova Jun 21, 2024
d3c56f3
Add florence2 configs
xenova Jun 21, 2024
65ce093
Implement `Florence2ForConditionalGeneration`
xenova Jun 21, 2024
666dfcc
Create `Florence2Processor`
xenova Jun 21, 2024
0a5bbd9
Add florence2 unit tests
xenova Jun 21, 2024
4ca37d4
Throw error if neither `decoder_input_ids` nor `decoder_inputs_embeds…
xenova Jun 21, 2024
dab703d
Improve merging input embeds
xenova Jun 21, 2024
11da868
Add comments
xenova Jun 21, 2024
1ab49a5
Map ngram bigint -> number
xenova Jun 22, 2024
2649fd6
Add florence2 to list of supported models
xenova Jun 22, 2024
4d92c7f
Minor conversion script improvements
xenova Jun 22, 2024
e8cfaf6
add onnxslim integration (#811)
inisis Jun 22, 2024
af37e87
Update onnxslim version
xenova Jun 22, 2024
e4c4848
Remove debug log
xenova Jun 25, 2024
8cfc9ad
Implement `post_process_generation` for florence2 processor
xenova Jun 25, 2024
92954af
Rename `construct_prompts` method
xenova Jun 26, 2024
1f4ad16
Create Florence-2 demo
xenova Jun 26, 2024
aa97d1e
Remove unused line
xenova Jun 28, 2024
f37eae0
Add `MobileCLIP` to list of supported models
xenova Jun 28, 2024
10e2431
Add support for RT-DETR models
xenova Jun 28, 2024
4cb26a1
Support per-model `use_external_data_format`
xenova Jun 28, 2024
7ede237
Ensure `options.use_external_data_format !== null`
xenova Jun 28, 2024
29af02c
Fix musicgen progress
xenova Jul 1, 2024
0b5469b
Fix custom whisper configs
xenova Jul 2, 2024
14a2990
Improve custom whisper ONNX config
xenova Jul 2, 2024
d010cc2
Fix whisper word-level timestamps
xenova Jul 2, 2024
5ab1ea3
Optimize DTW (50x!)
xenova Jul 2, 2024
0dba266
Early dereferencing for performance boosts
xenova Jul 2, 2024
5e4e20f
cleanup
xenova Jul 2, 2024
dd6af93
Move quantization logic to `quantize.py`
xenova Jul 3, 2024
04af3d5
update deps
xenova Jul 3, 2024
9128651
Fix q4 quantization
xenova Jul 3, 2024
83cbb21
save q4 quantization
xenova Jul 4, 2024
eb61344
Add decode ASR test
xenova Jul 4, 2024
cec2400
Do not process last chunk unnecessarily
xenova Jul 4, 2024
c835b54
fp16 disable_shape_infer if model is too large
xenova Jul 4, 2024
45cd8d4
Use `check_and_save_model` for saving fp16 model
xenova Jul 4, 2024
88f3e44
Reorder functions
xenova Jul 4, 2024
23440f0
formatting
xenova Jul 4, 2024
b411e9f
Remove debug log
xenova Jul 4, 2024
04a334a
Fix q8 quantization for models > 2GB
xenova Jul 4, 2024
cd1ea69
correct attribute
xenova Jul 4, 2024
a167f6e
Fix `TextGenerationPipeline`
xenova Jul 4, 2024
ea73289
Fix pauses in whisper word-level timestamps
xenova Jul 4, 2024
344af32
Formatting
xenova Jul 4, 2024
c305c38
Sort added tokens by length to avoid early partial matches
xenova Jul 5, 2024
d6f6fd4
Add new tokenizer test
xenova Jul 8, 2024
1557b8d
Only finish with newline if running in Node.js
xenova Jul 8, 2024
9ac7ceb
Consider token timestamps when selecting longest common sequence
xenova Jul 9, 2024
79ed46e
Create whisper word-level timestamps demo
xenova Jul 10, 2024
8da6886
cleanup
xenova Jul 10, 2024
d709bd0
Fallback to WASM if WebGPU not supported
xenova Jul 10, 2024
9ef3a6d
Reload model for each quantization mode
xenova Jul 12, 2024
9787b75
Update conversion script requirements
xenova Jul 12, 2024
974f086
Separate IO and Quantization args
xenova Jul 12, 2024
d042868
Use `const` where possible
xenova Jul 16, 2024
1b4d242
Add `InterruptableStoppingCriteria`
xenova Jul 16, 2024
31101c8
`@xenova/transformers` -> `@huggingface/transformers`
xenova Jul 17, 2024
e84322b
Override semver version
xenova Jul 17, 2024
bd94334
Add support for pyannote models
xenova Jul 17, 2024
3dbc633
Update README.md
xenova Jul 17, 2024
858e55d
Add listed support for pyannote
xenova Jul 17, 2024
8bf0349
Add pyannote example code
xenova Jul 17, 2024
c52618c
Support specifying `min_num_frames`
xenova Jul 17, 2024
96f19b0
Support simultaneous instantiation of multiple inference sessions
xenova Jul 20, 2024
4ad43e2
Support broadcasting encoder outputs over decoder inputs
xenova Jul 22, 2024
c6aeb4b
Fix test
xenova Jul 22, 2024
6d3ea4b
fix bundler config for latest ORT
fs-eire Jul 25, 2024
38a3bf6
Only check fp16 support for webgpu device
xenova Jul 29, 2024
9df84c4
Remove default chat templates
xenova Aug 7, 2024
fc3d860
Add support for gemma2
xenova Aug 7, 2024
939920d
Add gemma2 generation test
xenova Aug 7, 2024
5bb93a0
Update gemma2 config mapping
xenova Aug 7, 2024
72ec168
Prioritize high-performance adapter when possible
xenova Aug 7, 2024
9068a53
Set defaults for `tools` and `documents` in `apply_chat_template`
xenova Aug 7, 2024
824538b
bump `@huggingface/jinja` -> 0.3.0
xenova Aug 7, 2024
836c0af
Add `apply_chat_template` default parameters unit test
xenova Aug 7, 2024
487d8b2
Merge branch 'v3' into @huggingface/transformers
xenova Aug 7, 2024
1f6e0e1
Add prettier
xenova Aug 7, 2024
55494d1
prettier format config files
xenova Aug 7, 2024
5a68461
remove incorrect comment
xenova Aug 7, 2024
437cb34
Merge branch 'pr/864' into @huggingface/transformers
xenova Aug 7, 2024
5a6c926
Update onnxruntime-web version
xenova Aug 7, 2024
b19251b
Update webpack.config.js
xenova Aug 7, 2024
820c1e2
Fix copy path
xenova Aug 7, 2024
b0dab91
Run `npm ci`
xenova Aug 7, 2024
86b9b62
Fix bundling
xenova Aug 7, 2024
222b94e
Do not set `preferredOutputLocation` if we are proxying
xenova Aug 7, 2024
b326cc9
Merge branch 'v3' into @huggingface/transformers
xenova Aug 7, 2024
ca67092
Update `@webgpu/types`
xenova Aug 7, 2024
42076fd
Update SAM example
xenova Aug 7, 2024
48d3142
Use `??=` operator where possible
xenova Aug 7, 2024
3b1a4fd
Fix commonjs usage
xenova Aug 8, 2024
9a73b5e
Mark `onnxruntime-node` and `sharp` as externals
xenova Aug 8, 2024
9951aa5
Move `externals` into config
xenova Aug 8, 2024
c04d37e
Downgrade to onnxruntime 1.18.0
xenova Aug 8, 2024
d32fe2b
Finalize module/commonjs build
xenova Aug 8, 2024
1530d50
Separate web and node builds
xenova Aug 8, 2024
b4df0e2
[version] Update to 3.0.0-alpha.1
xenova Aug 8, 2024
ab59c51
Default to CDN-hosted .wasm files
xenova Aug 8, 2024
866b219
[version] Update to 3.0.0-alpha.2
xenova Aug 8, 2024
4a3398d
bump versions
xenova Aug 8, 2024
8891a14
[version] Update to 3.0.0-alpha.3
xenova Aug 8, 2024
a315933
Merge branch 'improve-conversion-script' into v3
xenova Aug 8, 2024
12569b8
Consolidate conversion and quantization script
xenova Aug 9, 2024
83f5718
Downgrade `onnxconverter-common`
xenova Aug 9, 2024
6fa5fa6
Link to types in exports
xenova Aug 9, 2024
2f1b210
Update list of supported tasks
xenova Aug 10, 2024
27bc55d
Fixed unit tests
xenova Aug 10, 2024
23d1150
Update imports
xenova Aug 10, 2024
f9070dc
Bump versions to `3.0.0-alpha.4`
xenova Aug 10, 2024
c3494e1
[version] Update to 3.0.0-alpha.4
xenova Aug 10, 2024
973fb0d
Fix "Default condition should be last one"
xenova Aug 12, 2024
7376ecf
Bump versions
xenova Aug 12, 2024
0a04bc0
[version] Update to 3.0.0-alpha.5
xenova Aug 12, 2024
e4603cd
Update next.js client-side demo
xenova Aug 12, 2024
ff1853c
Initial WebNN Support
ibelem Aug 14, 2024
15574bc
Mark fs, path and url as external packages for node build
xenova Aug 15, 2024
7282862
Move content type map outside of `FileResponse` object
xenova Aug 15, 2024
22f7ced
Add GPU support for Node.js
xenova Aug 15, 2024
1e319a4
Bump versions
xenova Aug 15, 2024
d278891
[version] Update to 3.0.0-alpha.6
xenova Aug 15, 2024
3fefa17
Fix conflicts
ibelem Aug 16, 2024
fa6cc70
bump dependency versions
xenova Aug 16, 2024
7fa5326
Add support for device auto-detection
xenova Aug 16, 2024
4ec77c1
Fix default device selection
xenova Aug 16, 2024
5799e30
Merge branch 'pr/ibelem/890-1' into v3
xenova Aug 16, 2024
5b2cac2
Improve WebNN selection
xenova Aug 17, 2024
ad23c50
Skip token callback if `skip_prompt` is set
xenova Aug 17, 2024
5b84b62
Bump versions
xenova Aug 19, 2024
bcf6a86
[version] Update to 3.0.0-alpha.7
xenova Aug 19, 2024
b97ed0d
bump versions
xenova Aug 21, 2024
c5b7083
[version] Update to 3.0.0-alpha.8
xenova Aug 21, 2024
cbeefde
bump versions
xenova Aug 23, 2024
59600f2
[version] Update to 3.0.0-alpha.9
xenova Aug 23, 2024
b2e025a
Add support for Sapiens
xenova Aug 27, 2024
8661d95
Update default ONNX env
xenova Aug 27, 2024
57db34d
Fix types
xenova Aug 27, 2024
1b7f978
Topologically sort fp16 nodes
xenova Aug 27, 2024
45d1526
Add marian unit test
xenova Aug 27, 2024
b903757
Re-order imports
xenova Aug 27, 2024
633976f
Fix `NoBadWordsLogitsProcessor`
xenova Aug 27, 2024
24d8787
Update package.json
xenova Aug 27, 2024
9412ec4
[jest] Disable coverage
xenova Aug 27, 2024
08e7388
Bump versions
xenova Aug 27, 2024
d5a8f87
[version] Update to 3.0.0-alpha.10
xenova Aug 27, 2024
7843ad0
Improve node/web interoperability
xenova Aug 28, 2024
bf093ae
Fix scripts/requirements.txt
xenova Aug 28, 2024
9a5ee42
Bump versions
xenova Aug 28, 2024
535cdfe
[version] Update to 3.0.0-alpha.11
xenova Aug 28, 2024
4e1acf0
Add support for JAIS models (#906)
xenova Aug 28, 2024
488548d
Add JAIS to README
xenova Aug 28, 2024
13aed41
Fix node/web interop (again)
xenova Aug 28, 2024
7655f81
Bump versions
xenova Aug 28, 2024
1c7e226
[version] Update to 3.0.0-alpha.12
xenova Aug 28, 2024
ab6b28b
Set `SapiensForNormalEstimation` to encoder-only
xenova Aug 28, 2024
66c05d5
Implement `sub` tensor operation
xenova Aug 28, 2024
31e8b2a
Bump versions
xenova Aug 28, 2024
bf3f7d5
[version] Update to 3.0.0-alpha.13
xenova Aug 28, 2024
c025356
Improve typing for `wrap` helper function
xenova Aug 28, 2024
7ebdaf2
Update `preferredOutputLocation` type
xenova Aug 28, 2024
3b8ddcb
Make `wrap` type more generic
xenova Aug 28, 2024
a385c6e
Re-use `segmentation_data`
xenova Aug 28, 2024
537e958
Fix `min` type
xenova Aug 28, 2024
bcb28b3
Add support for Hiera models
xenova Aug 29, 2024
d21c87c
Fix reused loop variable (closes #910)
xenova Aug 30, 2024
1d281f6
Add logits processor test file
xenova Aug 30, 2024
ba0427f
Fix test imports
xenova Aug 30, 2024
3bc3e86
Bump versions
xenova Aug 30, 2024
0518960
[version] Update to 3.0.0-alpha.14
xenova Aug 30, 2024
552cdea
Add another `bad_words` logits processor test (closes #913)
xenova Aug 30, 2024
3422a8b
Add support for GroupViT
xenova Aug 30, 2024
3599902
Add zero-shot-image-classification unit test
xenova Aug 30, 2024
5892ee8
Add maskformer model definitions
xenova Aug 30, 2024
c4dac77
Support universal image segmentation in `image-segmentation` pipeline
xenova Aug 30, 2024
f0c47be
Add support for PVT models
xenova Aug 30, 2024
d80d3a4
Add `post_process_instance_segmentation` function template
xenova Aug 30, 2024
844099d
Add `library_name` option to convert.py
xenova Sep 2, 2024
ba5d725
Wrap onnxslim with try block
xenova Sep 2, 2024
b3691c8
Use const where possible
xenova Sep 2, 2024
dcf117f
Use const where possible (again)
xenova Sep 2, 2024
9af026c
Create `MaskFormerFeatureExtractor`
xenova Sep 2, 2024
0f8200c
Add support for MaskFormer
xenova Sep 2, 2024
e278c5e
Improve tool-use chat template detection
xenova Sep 2, 2024
83fa58f
Add object detection pipeline unit test
xenova Sep 2, 2024
86d6da4
Add support for ViTMSN and VitMAE
xenova Sep 2, 2024
93b25fb
Bump ORT versions
xenova Sep 7, 2024
2f680ee
Create `get_chat_template` helper function
xenova Sep 7, 2024
2f9b2ed
Fix CI
xenova Sep 9, 2024
deec350
Run prettier on `tests/**`
xenova Sep 9, 2024
48fa226
move certain tests to utils subfolder
xenova Sep 9, 2024
a10828f
Bump onnxruntime-web version
xenova Sep 9, 2024
ba58ea2
Bump `onnxruntime==1.19.2` in scripts/requirements.txt
xenova Sep 9, 2024
4f17e95
Merge branch 'main' into v3
xenova Sep 9, 2024
c40a151
Merge branch 'main' into v3
xenova Sep 9, 2024
30315b2
Sort `this.added_tokens` before creating regex (`.toSorted` is not av…
xenova Sep 9, 2024
d7df575
Rather make a copy of `this.added_tokens`
xenova Sep 9, 2024
a519379
Fix `.tokenize` with `fuse_unk=true`
xenova Sep 9, 2024
89ddccf
Add blenderbot tokenizer tests
xenova Sep 9, 2024
36ad144
Add t5 tokenizer tests
xenova Sep 9, 2024
4765dd6
Add falcon tokenizer tests
xenova Sep 10, 2024
fd8b9a2
Run prettier
xenova Sep 10, 2024
710816e
Add ESM tokenizer tests
xenova Sep 10, 2024
0d3cd30
Run unit tests in parallel
xenova Sep 10, 2024
cc258c2
Fix `fuse_unk` for tokenizers with `byte_fallback=true` but no byte f…
xenova Sep 10, 2024
4798755
Add llama tokenizer unit tests
xenova Sep 10, 2024
c6c3ae1
Update emoji test string names
xenova Sep 10, 2024
79a7409
Move whisper-specific unit tests to subfolder
xenova Sep 10, 2024
1a38804
Code formatting
xenova Sep 10, 2024
dabe6ae
Bump versions
xenova Sep 10, 2024
54f1f21
[version] Update to 3.0.0-alpha.15
xenova Sep 10, 2024
a912d79
Add emoji tokenizer test cases for LlamaTokenizer
xenova Sep 12, 2024
969d10e
Attempt to fix encoder-decoder memory leak
xenova Sep 17, 2024
072cbbc
Remove unused code
xenova Sep 17, 2024
14b4bd4
Fix BertNormalizer (strip `Mn` unicode characters)
xenova Sep 17, 2024
6797771
Handle ZERO WIDTH JOINER (U+200D) characters
xenova Sep 17, 2024
f148afd
Add more spm normalization characters
xenova Sep 17, 2024
ca4b5b9
Add emoji unit tests for bert/t5
xenova Sep 17, 2024
113c81e
[WebNN] Add support for specifying `free_dimension_overrides` in config
xenova Sep 18, 2024
9005acc
Log warning if webnn is selected by `free_dimension_overrides` is not…
xenova Sep 18, 2024
682c7d0
Fix unigram for multi-byte tokens
xenova Sep 18, 2024
4a31e54
Add gemma tokenizer tests
xenova Sep 22, 2024
7a16065
Allow user to specify device and dtype in config.json
xenova Sep 23, 2024
4c1d21b
Update dependency versions
xenova Sep 23, 2024
3c6a95a
Bump versions
xenova Sep 23, 2024
ac391d2
[version] Update to 3.0.0-alpha.16
xenova Sep 23, 2024
13 changes: 7 additions & 6 deletions .github/workflows/tests.yml
@@ -7,12 +7,15 @@ on:
pull_request:
branches:
- main

env:
TESTING_REMOTELY: true
types:
- opened
- reopened
- synchronize
- ready_for_review

jobs:
build:
if: github.event.pull_request.draft == false
runs-on: ubuntu-latest

strategy:
@@ -27,11 +30,9 @@ jobs:
node-version: ${{ matrix.node-version }}
- run: npm ci
- run: npm run build
- run: pip install -r tests/requirements.txt

# Setup the testing environment
- run: npm run generate-tests
- run: git lfs install && GIT_CLONE_PROTECTION_ACTIVE=false git clone https://huggingface.co/Xenova/t5-small ./models/t5-small
- run: git lfs install && GIT_CLONE_PROTECTION_ACTIVE=false git clone https://huggingface.co/hf-internal-testing/tiny-random-T5ForConditionalGeneration ./models/hf-internal-testing/tiny-random-T5ForConditionalGeneration

# Actually run tests
- run: npm run test
8 changes: 8 additions & 0 deletions .prettierignore
@@ -0,0 +1,8 @@
# Ignore artifacts:
.github
dist
docs
examples
scripts
types
*.md
10 changes: 10 additions & 0 deletions .prettierrc
@@ -0,0 +1,10 @@
{
"overrides": [
{
"files": ["tests/**/*.js"],
"options": {
"printWidth": 10000000
}
}
]
}
69 changes: 51 additions & 18 deletions README.md

Large diffs are not rendered by default.

26 changes: 18 additions & 8 deletions docs/scripts/build_readme.py
@@ -5,19 +5,29 @@
<p align="center">
<br/>
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://github.com/xenova/transformers.js/assets/26504141/bd047e0f-aca9-4ff7-ba07-c7ca55442bc4" width="500" style="max-width: 100%;">
<source media="(prefers-color-scheme: light)" srcset="https://github.com/xenova/transformers.js/assets/26504141/84a5dc78-f4ea-43f4-96f2-b8c791f30a8e" width="500" style="max-width: 100%;">
<img alt="transformers.js javascript library logo" src="https://github.com/xenova/transformers.js/assets/26504141/84a5dc78-f4ea-43f4-96f2-b8c791f30a8e" width="500" style="max-width: 100%;">
<source media="(prefers-color-scheme: dark)" srcset="https://huggingface.co/datasets/Xenova/transformers.js-docs/resolve/main/transformersjs-dark.svg" width="500" style="max-width: 100%;">
<source media="(prefers-color-scheme: light)" srcset="https://huggingface.co/datasets/Xenova/transformers.js-docs/resolve/main/transformersjs-light.svg" width="500" style="max-width: 100%;">
<img alt="transformers.js javascript library logo" src="https://huggingface.co/datasets/Xenova/transformers.js-docs/resolve/main/transformersjs-light.svg" width="500" style="max-width: 100%;">
</picture>
<br/>
</p>

<p align="center">
<a href="https://www.npmjs.com/package/@xenova/transformers"><img alt="NPM" src="https://img.shields.io/npm/v/@xenova/transformers"></a>
<a href="https://www.npmjs.com/package/@xenova/transformers"><img alt="NPM Downloads" src="https://img.shields.io/npm/dw/@xenova/transformers"></a>
<a href="https://www.jsdelivr.com/package/npm/@xenova/transformers"><img alt="jsDelivr Hits" src="https://img.shields.io/jsdelivr/npm/hw/@xenova/transformers"></a>
<a href="https://github.com/xenova/transformers.js/blob/main/LICENSE"><img alt="License" src="https://img.shields.io/github/license/xenova/transformers.js?color=blue"></a>
<a href="https://huggingface.co/docs/transformers.js/index"><img alt="Documentation" src="https://img.shields.io/website/http/huggingface.co/docs/transformers.js/index.svg?down_color=red&down_message=offline&up_message=online"></a>
<a href="https://www.npmjs.com/package/@huggingface/transformers">
<img alt="NPM" src="https://img.shields.io/npm/v/@huggingface/transformers">
</a>
<a href="https://www.npmjs.com/package/@huggingface/transformers">
<img alt="NPM Downloads" src="https://img.shields.io/npm/dw/@huggingface/transformers">
</a>
<a href="https://www.jsdelivr.com/package/npm/@huggingface/transformers">
<img alt="jsDelivr Hits" src="https://img.shields.io/jsdelivr/npm/hw/@huggingface/transformers">
</a>
<a href="https://github.com/xenova/transformers.js/blob/main/LICENSE">
<img alt="License" src="https://img.shields.io/github/license/xenova/transformers.js?color=blue">
</a>
<a href="https://huggingface.co/docs/transformers.js/index">
<img alt="Documentation" src="https://img.shields.io/website/http/huggingface.co/docs/transformers.js/index.svg?down_color=red&down_message=offline&up_message=online">
</a>
</p>

{intro}
6 changes: 3 additions & 3 deletions docs/snippets/0_introduction.snippet
@@ -3,9 +3,9 @@ State-of-the-art Machine Learning for the web. Run πŸ€— Transformers directly in

Transformers.js is designed to be functionally equivalent to Hugging Face's [transformers](https://github.com/huggingface/transformers) python library, meaning you can run the same pretrained models using a very similar API. These models support common tasks in different modalities, such as:
- πŸ“ **Natural Language Processing**: text classification, named entity recognition, question answering, language modeling, summarization, translation, multiple choice, and text generation.
- πŸ–ΌοΈ **Computer Vision**: image classification, object detection, and segmentation.
- πŸ—£οΈ **Audio**: automatic speech recognition and audio classification.
- πŸ™ **Multimodal**: zero-shot image classification.
- πŸ–ΌοΈ **Computer Vision**: image classification, object detection, segmentation, and depth estimation.
- πŸ—£οΈ **Audio**: automatic speech recognition, audio classification, and text-to-speech.
- πŸ™ **Multimodal**: embeddings, zero-shot audio classification, zero-shot image classification, and zero-shot object detection.

Transformers.js uses [ONNX Runtime](https://onnxruntime.ai/) to run models in the browser. The best part about it, is that you can easily [convert](#convert-your-models-to-onnx) your pretrained PyTorch, TensorFlow, or JAX models to ONNX using [πŸ€— Optimum](https://github.com/huggingface/optimum#onnx--onnx-runtime).

2 changes: 1 addition & 1 deletion docs/snippets/1_quick-tour.snippet
@@ -23,7 +23,7 @@ out = pipe('I love transformers!')
<td>

```javascript
import { pipeline } from '@xenova/transformers';
import { pipeline } from '@huggingface/transformers';

// Allocate a pipeline for sentiment-analysis
let pipe = await pipeline('sentiment-analysis');
let out = await pipe('I love transformers!');
```
6 changes: 3 additions & 3 deletions docs/snippets/2_installation.snippet
@@ -1,12 +1,12 @@

To install via [NPM](https://www.npmjs.com/package/@xenova/transformers), run:
To install via [NPM](https://www.npmjs.com/package/@huggingface/transformers), run:
```bash
npm i @xenova/transformers
npm i @huggingface/transformers
```

Alternatively, you can use it in vanilla JS, without any bundler, by using a CDN or static hosting. For example, using [ES Modules](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules), you can import the library with:
```html
<script type="module">
import { pipeline } from 'https://cdn.jsdelivr.net/npm/@xenova/transformers@2.17.2';
import { pipeline } from 'https://cdn.jsdelivr.net/npm/@huggingface/transformers@3.0.0-alpha.16';
</script>
```
5 changes: 2 additions & 3 deletions docs/snippets/4_custom-usage.snippet
@@ -1,12 +1,11 @@


By default, Transformers.js uses [hosted pretrained models](https://huggingface.co/models?library=transformers.js) and [precompiled WASM binaries](https://cdn.jsdelivr.net/npm/@xenova/[email protected]/dist/), which should work out-of-the-box. You can customize this as follows:

By default, Transformers.js uses [hosted pretrained models](https://huggingface.co/models?library=transformers.js) and [precompiled WASM binaries](https://cdn.jsdelivr.net/npm/@huggingface/[email protected]/dist/), which should work out-of-the-box. You can customize this as follows:

### Settings

```javascript
import { env } from '@xenova/transformers';
import { env } from '@huggingface/transformers';

// Specify a custom location for models (defaults to '/models/').
env.localModelPath = '/path/to/models/';
```