fix(deps): bump litellm from 1.42.3 to 1.42.5 #460

Merged
1 commit merged into main on Jul 30, 2024

Conversation


dependabot[bot] commented on behalf of GitHub on Jul 29, 2024

Bumps litellm from 1.42.3 to 1.42.5.

Release notes

Sourced from litellm's releases.

v1.42.5

🔥 We're launching filtering LLMs by provider and max_tokens on https://models.litellm.ai 👉 View cost and max_tokens for 200+ LLMs (@LiteLLM); a programmatic sketch follows the feature list below.

[Screenshot: litellm_filters]

🔭 [Feat] - log writing BatchSpendUpdate events on OTEL

🔑 Proxy Enterprise - security - check max request size

🛡️ [Feat Enterprise] - check max response size

✅ Feat Enterprise - set max request / response size UI
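
The filtering UI presumably draws on the same model cost map that ships with the litellm package, so a similar provider/max_tokens filter can be run locally. A minimal sketch, assuming the package-level litellm.model_cost dict with its usual litellm_provider and max_tokens fields; the "openai" provider and the 32k-token threshold are illustrative choices, not part of this release.

import litellm

# Keep only OpenAI models whose advertised context window is at least 32k tokens.
# Provider check comes first: a few bookkeeping entries in the map carry
# non-numeric fields, so short-circuiting avoids comparing against them.
large_openai_models = {
    name: info
    for name, info in litellm.model_cost.items()
    if info.get("litellm_provider") == "openai" and (info.get("max_tokens") or 0) >= 32_000
}

for name, info in sorted(large_openai_models.items()):
    print(name, info.get("max_tokens"), info.get("input_cost_per_token"))
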

What's Changed

Full Changelog: BerriAI/litellm@v1.42.4...v1.42.5

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.42.5
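
Once the container is running, the proxy exposes an OpenAI-compatible API on port 4000. A minimal sketch of calling it with the openai Python client; the model name and API key below are placeholders for whatever is actually configured on your proxy instance.

from openai import OpenAI

# Point the standard OpenAI client at the locally running LiteLLM proxy.
client = OpenAI(
    base_url="http://localhost:4000",  # port published by `docker run -p 4000:4000`
    api_key="sk-1234",                 # placeholder; use a key valid for your proxy
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder model name configured on the proxy
    messages=[{"role": "user", "content": "Hello from the LiteLLM proxy"}],
)
print(response.choices[0].message.content)
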

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 130.0 | 149.07872144345131 | 6.351580011280877 | 0.0 | 1901 | 0 | 107.79980099999875 | 1698.2656079999856 |

... (truncated)

Commits
  • 0627468 fix checking mode on health checks
  • 9b69e50 Merge pull request #4927 from BerriAI/litellm_set_max_request_response_size_ui
  • 10e70f8 Merge pull request #4928 from BerriAI/litellm_check_response_size
  • b2f745f Merge pull request #4926 from BerriAI/litellm_check_max_request_size
  • 3511aad allow setting max request / response size on admin UI
  • f633f7d set max_response_size_mb
  • b2f7233 feat check check_response_size_is_safe
  • 41ca6fd feat - check max response size
  • 9efebc4 Merge pull request #4730 from danielbichuetti/fireworks-fix-cost-map
  • f9eca63 Merge pull request #4919 from yujonglee/fix-canary-dev
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [litellm](https://github.com/BerriAI/litellm) from 1.42.3 to 1.42.5.
- [Release notes](https://github.com/BerriAI/litellm/releases)
- [Commits](BerriAI/litellm@v1.42.3...v1.42.5)

---
updated-dependencies:
- dependency-name: litellm
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <[email protected]>
dependabot[bot] added the dependencies (Pull requests that update a dependency file) and python (Pull requests that update Python code) labels on Jul 29, 2024
@lwaekfjlk merged commit f3af72f into main on Jul 30, 2024
14 of 17 checks passed
dependabot[bot] deleted the dependabot/pip/litellm-1.42.5 branch on July 30, 2024 at 18:36