[pull] master from mudler:master #73

Merged: 91 commits, May 13, 2024

Commits
962ebba
models(gallery): fixup phi-3 sha
mudler May 1, 2024
6a7a799
:arrow_up: Update ggerganov/llama.cpp (#2213)
localai-bot May 1, 2024
4690b53
feat: user defined inference device for CUDA and OpenVINO (#2212)
fakezeta May 2, 2024
e5bd9a7
models(gallery): add wizardlm2 (#2209)
mudler May 2, 2024
f7f8b48
models(gallery): Add Hermes-2-Pro-Llama-3-8B-GGUF (#2218)
mudler May 2, 2024
2c5a46b
feat(ux): Add chat, tts, and image-gen pages to the WebUI (#2222)
mudler May 2, 2024
2cc1bd8
:arrow_up: Update ggerganov/llama.cpp (#2224)
localai-bot May 2, 2024
a31d00d
feat(aio): switch to llama3-based for LLM (#2225)
mudler May 2, 2024
b58274b
feat(ui): support multiline and style `ul` (#2226)
mudler May 2, 2024
dc834cc
Update README.md
mudler May 3, 2024
a0aa5d0
feat: update ROCM and use smaller image (#2196)
cryptk May 3, 2024
929a68c
:arrow_up: Update docs version mudler/LocalAI (#2228)
localai-bot May 3, 2024
da0b6a8
:arrow_up: Update ggerganov/llama.cpp (#2229)
localai-bot May 3, 2024
ac0f3d6
:arrow_up: Update ggerganov/whisper.cpp (#2230)
localai-bot May 3, 2024
daba8a8
build(deps): bump tqdm from 4.65.0 to 4.66.3 in /examples/langchain/l…
dependabot[bot] May 3, 2024
54faaa8
fix(webui): correct documentation URL for text2img (#2233)
mudler May 4, 2024
fa10302
docs: updated Transformer parameters description (#2234)
fakezeta May 4, 2024
530bec9
feat(llama.cpp): do not specify backends to autoload and add llama.cp…
mudler May 4, 2024
06c43ca
fix(gallery): hermes-2-pro-llama3 models checksum changed (#2236)
Nold360 May 4, 2024
b70e2bf
models(gallery): add moondream2 (#2237)
mudler May 4, 2024
92f7feb
models(gallery): add llama3-llava (#2238)
mudler May 4, 2024
17e94fb
:arrow_up: Update ggerganov/llama.cpp (#2239)
localai-bot May 4, 2024
117c987
fix(webui): display small navbar with smaller screens (#2240)
mudler May 4, 2024
b69ff46
feat(startup): show CPU/GPU information with --debug (#2241)
mudler May 5, 2024
f2d3506
models(gallery): moondream2 fixups
mudler May 5, 2024
ab4ee54
models(gallery): add llama3-instruct-coder (#2242)
mudler May 5, 2024
f50c6a4
models(gallery): update poppy porpoise (#2243)
mudler May 5, 2024
3096566
models(gallery): poppy porpoise fix
mudler May 5, 2024
f3bcc64
models(gallery): add icon for instruct-coder
mudler May 5, 2024
810e8e5
models(gallery): add lumimaid (#2244)
mudler May 5, 2024
5cb96fe
models(gallery): add openbiollm (#2245)
mudler May 5, 2024
67ad353
Update README.md
mudler May 5, 2024
c579850
feat(single-build): generate single binaries for releases (#2246)
mudler May 5, 2024
b52ff12
test: check the response URL during image gen in `app_test.go` (#2248)
dave-gray101 May 5, 2024
c547502
:arrow_up: Update ggerganov/llama.cpp (#2251)
localai-bot May 5, 2024
169d8d2
gallery: Added some OpenVINO models (#2249)
fakezeta May 6, 2024
477655f
models(gallery): average_norrmie reupload
mudler May 6, 2024
581b894
:arrow_up: Update ggerganov/llama.cpp (#2255)
localai-bot May 6, 2024
fe055d4
feat(webui): ux improvements (#2247)
mudler May 6, 2024
fea9522
fix: OpenVINO winograd always disabled (#2252)
fakezeta May 7, 2024
d3ddc9e
UI: flag `trust_remote_code` to users // favicon support (#2253)
dave-gray101 May 7, 2024
d1e3436
Update readme: add ShellOracle to community integrations (#2254)
djcopley May 7, 2024
e28ba4b
Add missing Homebrew dependencies (#2256)
michaelmior May 7, 2024
995aa5e
:arrow_up: Update ggerganov/llama.cpp (#2263)
localai-bot May 7, 2024
02ec546
models(gallery): Add Soliloquy (#2260)
mudler May 7, 2024
6559ac1
feat(ui): prompt for chat, support vision, enhancements (#2259)
mudler May 7, 2024
5ff5f0b
fix(ux): fix small glitches (#2265)
mudler May 8, 2024
5bf56e0
models(gallery): add tess (#2266)
mudler May 8, 2024
ed4f412
models(gallery): add lumimaid variant (#2267)
mudler May 8, 2024
d6f76c7
models(gallery): add kunocchini (#2268)
mudler May 8, 2024
b20354b
models(gallery): add aurora (#2270)
mudler May 8, 2024
6eb77f0
models(gallery): add tiamat (#2269)
mudler May 8, 2024
b66baa3
:arrow_up: Update docs version mudler/LocalAI (#2271)
localai-bot May 8, 2024
0809e9e
models(gallery): fix openbiollm typo
mudler May 8, 2024
eca5200
:arrow_up: Update ggerganov/llama.cpp (#2272)
localai-bot May 8, 2024
ea777f8
models(gallery): update SHA for einstein
mudler May 8, 2024
d651f39
:arrow_up: Update ggerganov/whisper.cpp (#2273)
localai-bot May 8, 2024
bc272d1
ci: add checksum checker pipeline (#2274)
mudler May 8, 2024
1937118
Update checksum_checker.yaml
mudler May 8, 2024
6440b60
Update checksum_checker.yaml
mudler May 8, 2024
fd2d89d
Update checksum_checker.sh
mudler May 8, 2024
222d714
Update checksum_checker.yaml
mudler May 8, 2024
0baacca
Update checksum_checker.yaml
mudler May 8, 2024
cb6ddb2
Update checksum_checker.yaml
mudler May 8, 2024
9b4c6f3
Update checksum_checker.yaml
mudler May 8, 2024
9786bb8
ci: try to fix checksum_checker.sh
mudler May 9, 2024
6a209cb
ci: get file name correctly in checksum_checker.sh
mudler May 9, 2024
650ae62
ci: get latest git version
mudler May 9, 2024
f69de3b
models(gallery): :arrow_up: update checksum (#2278)
localai-bot May 9, 2024
18a0424
:arrow_up: Update ggerganov/llama.cpp (#2281)
localai-bot May 9, 2024
e676809
:arrow_up: Update docs version mudler/LocalAI (#2280)
localai-bot May 10, 2024
28a421c
feat: migrate python backends from conda to uv (#2215)
cryptk May 10, 2024
4db41b7
models(gallery): add aloe (#2283)
mudler May 10, 2024
9b09eb0
build: do not specify a BUILD_ID by default (#2284)
mudler May 10, 2024
88d0aa1
docs: update function docs
mudler May 10, 2024
9e8b344
Update openai-functions.md
mudler May 10, 2024
cf513ef
Update openai-functions.md
mudler May 10, 2024
93e581d
:arrow_up: Update ggerganov/llama.cpp (#2285)
localai-bot May 10, 2024
7f4febd
models(gallery): add Llama-3-8B-Instruct-abliterated (#2288)
mudler May 11, 2024
e2de8a8
feat: create bash library to handle install/run/test of python backen…
cryptk May 11, 2024
dfc4207
:arrow_up: Update ggerganov/llama.cpp (#2290)
localai-bot May 11, 2024
efa32a2
feat(grammar): support models with specific construct (#2291)
mudler May 11, 2024
88942e4
fix: add missing openvino/optimum/etc libraries for Intel, fixes #228…
cryptk May 12, 2024
1b69b33
docs: Update semantic-todo/README.md (#2294)
eltociear May 12, 2024
ca14f95
models(gallery): add l3-chaoticsoliloquy-v1.5-4x8b (#2295)
mudler May 12, 2024
98af0b5
models(gallery): add jsl-medllama-3-8b-v2.0 (#2296)
mudler May 12, 2024
310b217
models(gallery): add llama-3-refueled (#2297)
mudler May 12, 2024
9d8c705
feat(ui): display number of available models for installation (#2298)
mudler May 12, 2024
5b79bd0
add setuptools for openvino (#2301)
fakezeta May 12, 2024
5534b13
feat(swagger): update swagger (#2302)
localai-bot May 12, 2024
b4cb22f
:arrow_up: Update ggerganov/llama.cpp (#2303)
localai-bot May 12, 2024
7 changes: 6 additions & 1 deletion .dockerignore
@@ -6,6 +6,11 @@ examples/chatbot-ui/models
examples/rwkv/models
examples/**/models
Dockerfile*
__pycache__

# SonarQube
.scannerwork
.scannerwork

# backend virtual environments
**/venv
backend/python/**/source
111 changes: 111 additions & 0 deletions .github/checksum_checker.sh
@@ -0,0 +1,111 @@
#!/bin/bash
# This script needs yq and huggingface_hub to be installed.
# To install huggingface_hub, run: pip install huggingface_hub

# Path to the input YAML file
input_yaml=$1

# Function to download file and check checksum using Python
function check_and_update_checksum() {
model_name="$1"
file_name="$2"
uri="$3"
old_checksum="$4"
idx="$5"

# Download the file and calculate new checksum using Python
new_checksum=$(python3 -c "
import hashlib
from huggingface_hub import hf_hub_download
import requests
import sys
import os

uri = '$uri'
file_name = uri.split('/')[-1]

# Function to parse the URI and determine download method
def parse_uri(uri):
if uri.startswith('huggingface://'):
repo_id = uri.split('://')[1]
return 'huggingface', repo_id.rsplit('/', 1)[0]
elif 'huggingface.co' in uri:
parts = uri.split('/resolve/')
if len(parts) > 1:
repo_path = parts[0].split('https://huggingface.co/')[-1]
return 'huggingface', repo_path
return 'direct', uri

def calculate_sha256(file_path):
sha256_hash = hashlib.sha256()
with open(file_path, 'rb') as f:
for byte_block in iter(lambda: f.read(4096), b''):
sha256_hash.update(byte_block)
return sha256_hash.hexdigest()

download_type, repo_id_or_url = parse_uri(uri)

# Decide download method based on URI type
if download_type == 'huggingface':
try:
file_path = hf_hub_download(repo_id=repo_id_or_url, filename=file_name)
except Exception as e:
print(f'Error from Hugging Face Hub: {str(e)}', file=sys.stderr)
sys.exit(2)
else:
response = requests.get(repo_id_or_url)
if response.status_code == 200:
with open(file_name, 'wb') as f:
f.write(response.content)
file_path = file_name
elif response.status_code == 404:
print(f'File not found: {response.status_code}', file=sys.stderr)
sys.exit(2)
else:
print(f'Error downloading file: {response.status_code}', file=sys.stderr)
sys.exit(1)

print(calculate_sha256(file_path))
# Clean up the downloaded file
os.remove(file_path)
")

result=$?

if [[ $result -eq 2 ]]; then
echo "File not found, deleting entry for $file_name..."
# yq eval -i "del(.[$idx].files[] | select(.filename == \"$file_name\"))" "$input_yaml"
return
elif [[ $result -ne 0 ]]; then
echo "Error downloading file $file_name. Skipping..."
return
fi

if [[ "$new_checksum" == "" ]]; then
echo "Error calculating checksum for $file_name. Skipping..."
return
fi

echo "Checksum for $file_name: $new_checksum"

# Compare and update the YAML file if checksums do not match
if [[ "$old_checksum" != "$new_checksum" ]]; then
echo "Checksum mismatch for $file_name. Updating..."
yq eval -i "del(.[$idx].files[] | select(.filename == \"$file_name\").sha256)" "$input_yaml"
yq eval -i "(.[$idx].files[] | select(.filename == \"$file_name\")).sha256 = \"$new_checksum\"" "$input_yaml"
else
echo "Checksum match for $file_name. No update needed."
fi
}

# Read the YAML and process each file
len=$(yq eval '. | length' "$input_yaml")
for ((i=0; i<$len; i++))
do
name=$(yq eval ".[$i].name" "$input_yaml")
files_len=$(yq eval ".[$i].files | length" "$input_yaml")
for ((j=0; j<$files_len; j++))
do
filename=$(yq eval ".[$i].files[$j].filename" "$input_yaml")
uri=$(yq eval ".[$i].files[$j].uri" "$input_yaml")
checksum=$(yq eval ".[$i].files[$j].sha256" "$input_yaml")
echo "Checking model $name, file $filename. URI = $uri, Checksum = $checksum"
check_and_update_checksum "$name" "$filename" "$uri" "$checksum" "$i"
done
done
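
The script above walks every entry of the gallery index passed as its first argument, reading `.[].files[].filename`, `.uri`, and `.sha256` via yq, recomputing the SHA-256 of each file (through `hf_hub_download` for Hugging Face URIs, plain HTTP otherwise), and rewriting the stored checksum on mismatch. A minimal local dry run might look like the sketch below; the model entry and URI are placeholders, and it assumes yq v4 and `pip install huggingface_hub` are available.

```bash
# Hypothetical single-entry index just to exercise the script locally;
# the model name, filename and URI below are placeholders, not real gallery entries.
cat > /tmp/test-index.yaml <<'EOF'
- name: example-model
  files:
    - filename: example.Q4_K_M.gguf
      uri: https://huggingface.co/example-org/example-model/resolve/main/example.Q4_K_M.gguf
      sha256: "0000000000000000000000000000000000000000000000000000000000000000"
EOF

# Requires yq v4 and huggingface_hub; edits the sha256 fields in place on mismatch.
bash .github/checksum_checker.sh /tmp/test-index.yaml
```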
47 changes: 47 additions & 0 deletions .github/workflows/checksum_checker.yaml
@@ -0,0 +1,47 @@
name: Check if checksums are up-to-date
on:
schedule:
- cron: 0 20 * * *
workflow_dispatch:
jobs:
checksum_check:
runs-on: arc-runner-set
steps:
- name: Force Install GIT latest
run: |
sudo apt-get update \
&& sudo apt-get install -y software-properties-common \
&& sudo apt-get update \
&& sudo add-apt-repository -y ppa:git-core/ppa \
&& sudo apt-get update \
&& sudo apt-get install -y git
- uses: actions/checkout@v4
- name: Install dependencies
run: |
sudo apt-get update
sudo apt-get install -y pip wget
sudo pip install --upgrade pip
pip install huggingface_hub
- name: 'Setup yq'
uses: dcarbone/install-yq-action@v1.1.1
with:
version: 'v4.43.1'
download-compressed: true
force: true

- name: Checksum checker 🔧
run: |
export HF_HOME=/hf_cache
sudo mkdir /hf_cache
sudo chmod 777 /hf_cache
bash .github/checksum_checker.sh gallery/index.yaml
- name: Create Pull Request
uses: peter-evans/create-pull-request@v6
with:
token: ${{ secrets.UPDATE_BOT_TOKEN }}
push-to-fork: ci-forks/LocalAI
commit-message: ':arrow_up: Checksum updates in gallery/index.yaml'
title: 'models(gallery): :arrow_up: update checksum'
branch: "update/checksum"
body: Updating checksums in gallery/index.yaml
signoff: true
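
The workflow runs nightly at 20:00 UTC and also exposes `workflow_dispatch`, so it can be kicked off by hand. As a rough sketch, assuming an authenticated GitHub CLI and that the workflow file is present on the default branch of the target repository:

```bash
# Trigger the checksum check manually; the repo and branch here are assumptions.
gh workflow run checksum_checker.yaml --repo mudler/LocalAI --ref master

# Follow the latest run of that workflow.
gh run list --repo mudler/LocalAI --workflow checksum_checker.yaml --limit 1
```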
2 changes: 1 addition & 1 deletion .github/workflows/image-pr.yml
@@ -61,7 +61,7 @@ jobs:
tag-suffix: '-hipblas'
ffmpeg: 'false'
image-type: 'extras'
base-image: "rocm/dev-ubuntu-22.04:6.0-complete"
base-image: "rocm/dev-ubuntu-22.04:6.1"
grpc-base-image: "ubuntu:22.04"
runs-on: 'arc-runner-set'
makeflags: "--jobs=3 --output-sync=target"
8 changes: 4 additions & 4 deletions .github/workflows/image.yml
@@ -129,7 +129,7 @@ jobs:
ffmpeg: 'true'
image-type: 'extras'
aio: "-aio-gpu-hipblas"
base-image: "rocm/dev-ubuntu-22.04:6.0-complete"
base-image: "rocm/dev-ubuntu-22.04:6.1"
grpc-base-image: "ubuntu:22.04"
latest-image: 'latest-gpu-hipblas'
latest-image-aio: 'latest-aio-gpu-hipblas'
@@ -141,7 +141,7 @@ jobs:
tag-suffix: '-hipblas'
ffmpeg: 'false'
image-type: 'extras'
base-image: "rocm/dev-ubuntu-22.04:6.0-complete"
base-image: "rocm/dev-ubuntu-22.04:6.1"
grpc-base-image: "ubuntu:22.04"
runs-on: 'arc-runner-set'
makeflags: "--jobs=3 --output-sync=target"
@@ -218,7 +218,7 @@ jobs:
tag-suffix: '-hipblas-ffmpeg-core'
ffmpeg: 'true'
image-type: 'core'
base-image: "rocm/dev-ubuntu-22.04:6.0-complete"
base-image: "rocm/dev-ubuntu-22.04:6.1"
grpc-base-image: "ubuntu:22.04"
runs-on: 'arc-runner-set'
makeflags: "--jobs=3 --output-sync=target"
@@ -228,7 +228,7 @@ jobs:
tag-suffix: '-hipblas-core'
ffmpeg: 'false'
image-type: 'core'
base-image: "rocm/dev-ubuntu-22.04:6.0-complete"
base-image: "rocm/dev-ubuntu-22.04:6.1"
grpc-base-image: "ubuntu:22.04"
runs-on: 'arc-runner-set'
makeflags: "--jobs=3 --output-sync=target"
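The two image workflows above bump the hipblas base image from `rocm/dev-ubuntu-22.04:6.0-complete` to `rocm/dev-ubuntu-22.04:6.1`. A quick local sanity check that the new base resolves, assuming a Docker daemon with Docker Hub access (the `/opt/rocm` layout check is an assumption):

```bash
# Pull the new ROCm 6.1 base image used by the hipblas variants above.
docker pull rocm/dev-ubuntu-22.04:6.1

# Rough check that the ROCm install is present inside the image.
docker run --rm rocm/dev-ubuntu-22.04:6.1 ls /opt/rocm
```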
68 changes: 2 additions & 66 deletions .github/workflows/release.yaml
@@ -19,12 +19,8 @@ jobs:
strategy:
matrix:
include:
- build: 'avx2'
- build: ''
defines: ''
- build: 'avx'
defines: '-DLLAMA_AVX2=OFF'
- build: 'avx512'
defines: '-DLLAMA_AVX512=ON'
- build: 'cuda12'
defines: ''
- build: 'cuda11'
@@ -74,7 +70,6 @@ jobs:
- name: Build
id: build
env:
CMAKE_ARGS: "${{ matrix.defines }}"
BUILD_ID: "${{ matrix.build }}"
run: |
go install google.golang.org/grpc/cmd/protoc-gen-go-grpc@latest
@@ -124,63 +119,7 @@ jobs:
name: stablediffusion
path: release/

build-macOS:
strategy:
matrix:
include:
- build: 'avx2'
defines: ''
- build: 'avx'
defines: '-DLLAMA_AVX2=OFF'
- build: 'avx512'
defines: '-DLLAMA_AVX512=ON'
runs-on: macOS-latest
steps:
- name: Clone
uses: actions/checkout@v4
with:
submodules: true
- uses: actions/setup-go@v5
with:
go-version: '1.21.x'
cache: false
- name: Dependencies
run: |
brew install protobuf grpc
go install google.golang.org/grpc/cmd/protoc-gen-go-grpc@latest
go install google.golang.org/protobuf/cmd/protoc-gen-go@latest
- name: Build
id: build
env:
CMAKE_ARGS: "${{ matrix.defines }}"
BUILD_ID: "${{ matrix.build }}"
run: |
export C_INCLUDE_PATH=/usr/local/include
export CPLUS_INCLUDE_PATH=/usr/local/include
export PATH=$PATH:$GOPATH/bin
make dist
- uses: actions/upload-artifact@v4
with:
name: LocalAI-MacOS-${{ matrix.build }}
path: release/
- name: Release
uses: softprops/action-gh-release@v2
if: startsWith(github.ref, 'refs/tags/')
with:
files: |
release/*


build-macOS-arm64:
strategy:
matrix:
include:
- build: 'avx2'
defines: ''
- build: 'avx'
defines: '-DLLAMA_AVX2=OFF'
- build: 'avx512'
defines: '-DLLAMA_AVX512=ON'
runs-on: macos-14
steps:
- name: Clone
@@ -198,17 +137,14 @@ jobs:
go install google.golang.org/protobuf/cmd/protoc-gen-go@latest
- name: Build
id: build
env:
CMAKE_ARGS: "${{ matrix.defines }}"
BUILD_ID: "${{ matrix.build }}"
run: |
export C_INCLUDE_PATH=/usr/local/include
export CPLUS_INCLUDE_PATH=/usr/local/include
export PATH=$PATH:$GOPATH/bin
make dist
- uses: actions/upload-artifact@v4
with:
name: LocalAI-MacOS-arm64-${{ matrix.build }}
name: LocalAI-MacOS-arm64
path: release/
- name: Release
uses: softprops/action-gh-release@v2
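
The release workflow drops the per-CPU-flag (avx, avx2, avx512) matrix entries and the separate x86 macOS job, in line with the "feat(single-build): generate single binaries for releases (#2246)" commit, so `CMAKE_ARGS` is no longer threaded through the build step. A local sketch of the consolidated build, using only commands already present in the workflow and assuming a LocalAI checkout with Go and the protobuf/gRPC toolchain installed:

```bash
# Install the protoc plugins the workflow installs, then build the single artifact.
go install google.golang.org/grpc/cmd/protoc-gen-go-grpc@latest
go install google.golang.org/protobuf/cmd/protoc-gen-go@latest
export PATH=$PATH:$GOPATH/bin

# No per-variant CMAKE_ARGS/BUILD_ID any more; a single bundle lands under release/.
make dist
```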