Merge pull request #75 from KYLN24/main
package lomo_optim
KaiLv69 authored Mar 6, 2024
2 parents 5fab460 + 4a5c12f commit e23d04c
Showing 8 changed files with 816 additions and 3 deletions.
39 changes: 39 additions & 0 deletions .github/workflows/python-publish.yml
@@ -0,0 +1,39 @@
# This workflow will upload a Python Package using Twine when a release is created
# For more information see: https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-python#publishing-to-package-registries

# This workflow uses actions that are not certified by GitHub.
# They are provided by a third-party and are governed by
# separate terms of service, privacy policy, and support
# documentation.

name: Upload Python Package

on:
  release:
    types: [published]

permissions:
  contents: read

jobs:
  deploy:

    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v4
    - name: Set up Python
      uses: actions/setup-python@v5
      with:
        python-version: '3.x'
    - name: Install the dependencies
      run: |
        python -m pip install --upgrade pip
        pip install build twine
    - name: Build and publish
      env:
        TWINE_USERNAME: __token__
        TWINE_PASSWORD: ${{ secrets.PYPI_API_TOKEN }}
      run: |
        python -m build --wheel
        twine upload dist/*
4 changes: 4 additions & 0 deletions .gitignore
@@ -1 +1,5 @@
.idea
lomo_optim.egg-info
dist
build
__pycache__
20 changes: 18 additions & 2 deletions README.md
@@ -3,7 +3,23 @@
This is the implementation for [Full Parameter Fine-Tuning for Large Language Models with Limited Resources](https://arxiv.org/pdf/2306.09782.pdf)
and [AdaLomo: Low-memory Optimization with Adaptive Learning Rate](https://arxiv.org/pdf/2310.10195.pdf).

LOMO and AdaLomo are integrated in [CoLLiE](https://github.com/OpenLMLab/collie) library, which supports Collaborative Training of Large Language Models in an Efficient Way.
LOMO and AdaLomo are integrated into the [CoLLiE](https://github.com/OpenMOSS/collie) library, which supports Collaborative Training of Large Language Models in an Efficient Way.
You can also install `lomo-optim` from PyPI using pip.

```bash
pip install lomo-optim
```

Then, import `Lomo` or `AdaLomo`.

```python
from lomo_optim import Lomo
from lomo_optim import AdaLomo
```

The usage of `Lomo` and `AdaLomo` is similar to, but not the same as, PyTorch's optimizers
([example](https://github.com/OpenMOSS/CoLLiE/blob/726ec80d263c1e1c56344dfde5b3c24897daa94d/collie/controller/trainer.py#L469)).
We recommend using `AdaLomo` without `gradnorm` for better performance and higher throughput.
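
A minimal sketch of that training-loop pattern is shown below. It is not taken from this repository: the toy model, data, and learning rate are invented, and the `AdaLomo(model, lr=...)` and `fused_backward(loss, lr)` signatures should be checked against the installed `lomo_optim` version and the linked example.

```python
import torch
from lomo_optim import AdaLomo

# Toy setup. LOMO/AdaLomo wrap the model itself (not model.parameters())
# and update parameters during the backward pass, so there is no optimizer.step().
model = torch.nn.Linear(16, 4)
optimizer = AdaLomo(model, lr=1e-3)  # no gradnorm clipping, as recommended above

for _ in range(10):
    x = torch.randn(8, 16)
    loss = model(x).pow(2).mean()
    # fused_backward replaces the usual loss.backward() + optimizer.step() pair.
    optimizer.fused_backward(loss, 1e-3)
```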

# LOMO: LOw-Memory Optimization

@@ -38,4 +54,4 @@ The code for AdaLomo is in [adalomo](adalomo) folder.
journal={arXiv preprint arXiv:2306.09782},
year={2023}
}
```
```
16 changes: 15 additions & 1 deletion README_ZH.md
@@ -5,6 +5,20 @@
Implementation of the papers [Full Parameter Fine-Tuning for Large Language Models with Limited Resources](https://arxiv.org/pdf/2306.09782.pdf) and [AdaLomo: Low-memory Optimization with Adaptive Learning Rate](https://arxiv.org/pdf/2310.10195.pdf).

LOMO and AdaLomo have been integrated into [CoLLiE](https://github.com/OpenLMLab/collie) (Collaborative Training of Large Language Models in an Efficient Way).
You can also install the `lomo-optim` package from PyPI using pip.

```bash
pip install lomo-optim
```

Then, import `Lomo` or `AdaLomo` from `lomo_optim`.

```python
from lomo_optim import Lomo
from lomo_optim import AdaLomo
```
The usage of `Lomo` and `AdaLomo` is similar to, but not the same as, PyTorch's optimizers ([example](https://github.com/OpenMOSS/CoLLiE/blob/726ec80d263c1e1c56344dfde5b3c24897daa94d/collie/controller/trainer.py#L469)).
We recommend using `AdaLomo` without `gradnorm` for better performance and higher throughput.

# LOMO: LOw-Memory Optimization

@@ -38,4 +52,4 @@ The code for AdaLomo is in the [adalomo](adalomo) folder.
journal={arXiv preprint arXiv:2306.09782},
year={2023}
}
```
```
5 changes: 5 additions & 0 deletions lomo_optim/__init__.py
@@ -0,0 +1,5 @@
from .adalomo import AdaLomo
from .lomo import Lomo

__version__ = "0.1.0"
__all__ = ["Lomo", "AdaLomo"]