Issues: explosion/spacy-llm
#471: Update test suite to avoid crashing when `en_core_web_md` is unavailable
Label: tests (Everything related to the test suite)
Opened May 17, 2024 by svlandeg

#470: Support a unified API (such as LiteLLM) for all LLM providers / models
Opened May 17, 2024 by omri374

#463: `transformers` > 4.38 causes bug in inference for HF models
Label: bug (Something isn't working)
Opened Apr 24, 2024 by rmitsch

#450: Potential REL sharding issue
Labels: bug (Something isn't working), feat/sharding (Everything related to sharding/map-reduce), feat/task (Feature: tasks)
Opened Mar 8, 2024 by peter-axion

#443: Many returns are not what I want
Label: usage (How to use `spacy-llm`)
Opened Feb 18, 2024 by tianchiguaixia

#442: How to surpass BERT through large models
Label: usage (How to use `spacy-llm`)
Opened Feb 17, 2024 by tianchiguaixia

#436: Working dummy example for custom LLM endpoint integration
Label: usage (How to use `spacy-llm`)
Opened Jan 29, 2024 by borhenryk

#423: [Warning] the current text generation call will exceed the model's predefined maximum length (4096)
Labels: feat/model (Feature: models), usage (How to use `spacy-llm`)
Opened Jan 21, 2024 by yileitu

#414: FileNotFoundError: [Errno 2] No such file or directory: 'local-ner-cache/9963044417883968883.spacy'
Labels: bug (Something isn't working), feat/cache (Feature: caching)
Opened Jan 18, 2024 by nikolaysm

#411: '<' not supported between instances of 'str' and 'int'
Label: usage (How to use `spacy-llm`)
Opened Jan 8, 2024 by BaptisteLoquette