
Help wanted! How to fix the ModuleNotFoundError: No module named 'utils.callbacks' #4

Open
kanseaveg opened this issue Jul 3, 2024 · 10 comments


@kanseaveg

kanseaveg commented Jul 3, 2024

(knowla) [@gpu002 knowla]$ export PYTHONPATH=.
(knowla) [@gpu002 knowla]$ python test_csqa.py  --base_model='llama2_knowla_base_version/llama2_7B'  --is_KG=True  --lora_weights="./llama2-lora-cn"  --dataset="siqa"

===================================BUG REPORT===================================
Welcome to bitsandbytes. For bug reports, please submit your error trace to: https://github.com/TimDettmers/bitsandbytes/issues
================================================================================
CUDA SETUP: Required library version not found: libsbitsandbytes_cpu.so. Maybe you need to compile it from source?
CUDA SETUP: Defaulting to libbitsandbytes_cpu.so...
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cextension.py:31: UserWarning: The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers and GPU quantization are unavailable.
  warn("The installed version of bitsandbytes was compiled without GPU support. "
Traceback (most recent call last):
  File "/public/home/code/20240619/knowla/test_csqa.py", line 15, in <module>
    from utils.callbacks import Iteratorize, Stream
ModuleNotFoundError: No module named 'utils.callbacks'

How can I fix this problem? Thank you.

@luoxindi
Collaborator

luoxindi commented Jul 3, 2024

Thank you for your feedback. Please comment out that line of code.
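
For reference, a minimal sketch of the two ways to apply this suggestion (not from the repository itself): either comment out line 15 of test_csqa.py (`from utils.callbacks import Iteratorize, Stream`), or add a stub `utils/callbacks.py` so the import resolves. The stub below assumes Iteratorize and Stream are never actually called by the test scripts:

```python
# utils/callbacks.py -- hypothetical stub so the import in the test scripts resolves.
# Only appropriate if Iteratorize and Stream are never used at runtime; otherwise the
# real callbacks implementation is needed.

class Iteratorize:
    """Placeholder for the missing utils.callbacks.Iteratorize."""
    pass


class Stream:
    """Placeholder for the missing utils.callbacks.Stream."""
    pass
```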

@kanseaveg
Author

kanseaveg commented Jul 4, 2024

(knowla) [@gpu001 knowla_lora]$ python test_csqa.py \
>     --base_model='./llama2_7B' \
>     --is_KG=True \
>     --lora_weights="./official_provided/llama2-lora-r16" \
>     --dataset="siqa"

===================================BUG REPORT===================================
Welcome to bitsandbytes. For bug reports, please submit your error trace to: https://github.com/TimDettmers/bitsandbytes/issues
================================================================================
CUDA SETUP: Required library version not found: libsbitsandbytes_cpu.so. Maybe you need to compile it from source?
CUDA SETUP: Defaulting to libbitsandbytes_cpu.so...

/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cextension.py:31: UserWarning: The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers and GPU quantization are unavailable.
  warn("The installed version of bitsandbytes was compiled without GPU support. "
Traceback (most recent call last):
  File "/public/home/code/20240704/knowla_lora/test_csqa.py", line 15, in <module>
    from utils.callbacks import Iteratorize, Stream
ModuleNotFoundError: No module named 'utils.callbacks'

During the evaluation, I encountered this problem.

@kanseaveg
Author

kanseaveg commented Jul 4, 2024

(knowla) [@gpu001 knowla_lora]$ python test_bbh.py    --base_model='./llama2_7B'     --is_KG=True     --lora_weights="./official_provided/llama2-lora-r16/llama2-lora"     

===================================BUG REPORT===================================
Welcome to bitsandbytes. For bug reports, please submit your error trace to: https://github.com/TimDettmers/bitsandbytes/issues
================================================================================
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:136: UserWarning: /public/home/env/miniconda/envs/knowla did not contain libcudart.so as expected! Searching further paths...
  warn(msg)
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:136: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('sys/dashboard/sys/remote')}
  warn(msg)
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:136: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('1;/public/spack-module/spack/modules/cascadelake/linux-centos7-broadwell'), PosixPath('1;/public/simg'), PosixPath('1'), PosixPath('1;/public/spack-module/spack/modules/cascadelake/linux-centos7-haswell')}
  warn(msg)
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:136: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('/opt/apps/modulefiles')}
  warn(msg)
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:136: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('@/tmp/.ICE-unix/50346,unix/unix'), PosixPath('local/unix')}
  warn(msg)
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:136: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('/var/www/ood/apps/sys/dashboard')}
  warn(msg)
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:136: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('-30}";\n for ((i=1; i<=time*2; i++))\n do\n port_used "${port}";\n port_status=$?;\n if [ "$port_status" == "0" ]; then\n return 0;\n else\n if [ "$port_status" == "127" ]; then\n echo "commands to find port were either not found or inaccessible.";\n echo "command options are lsof, nc, bash\'s /dev/tcp, or python (or python3) with socket lib.";\n return 127;\n fi;\n fi;\n sleep 0.5;\n done;\n return 1\n}'), PosixPath('() {  local port="${1}";\n local time="${2')}
  warn(msg)
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:136: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath(' < /dev/tcp/$1/$2 ) > /dev/null 2>&1;\n else\n return 127;\n fi\n };\n function port_used () \n { \n local port="${1#*'), PosixPath('-30}";\n for ((i=1; i<=time*2; i++))\n do\n port_used "${port}";\n port_status=$?;\n if [ "$port_status" == "0" ]; then\n return 0;\n else\n if [ "$port_status" == "127" ]; then\n echo "commands to find port were either not found or inaccessible.";\n echo "command options are lsof, nc, bash\'s /dev/tcp, or python (or python3) with socket lib.";\n return 127;\n fi;\n fi;\n sleep 0.5;\n done;\n return 1\n };\n export -f wait_until_port_used;\n function create_passwd () \n { \n tr -cd \'a-zA-Z0-9\' < /dev/urandom 2> /dev/null | head -c${1'), PosixPath('${port}"; do\n port=$(random_number "${2'), PosixPath('() {  function random_number () \n { \n shuf -i ${1}-${2} -n 1\n };\n export -f random_number;\n function port_used_python () \n { \n python -c "import socket; socket.socket().connect((\'$1\',$2))" > /dev/null 2>&1\n };\n function port_used_python3 () \n { \n python3 -c "import socket; socket.socket().connect((\'$1\',$2))" > /dev/null 2>&1\n };\n function port_used_nc () \n { \n nc -w 2 "$1" "$2" < /dev/null > /dev/null 2>&1\n };\n function port_used_lsof () \n { \n lsof -i '), PosixPath('-2000}" "${3'), PosixPath('-65535}");\n done;\n echo "${port}"\n };\n export -f find_port;\n function wait_until_port_used () \n { \n local port="${1}";\n local time="${2'), PosixPath('\' || echo "localhost") | awk \'END{print $NF}\');\n local port_strategies=(port_used_nc port_used_lsof port_used_bash port_used_python port_used_python3);\n for strategy in ${port_strategies[@]};\n do\n $strategy $host $port;\n status=$?;\n if [[ "$status" == "0" ]] || [[ "$status" == "1" ]]; then\n return $status;\n fi;\n done;\n return 127\n };\n export -f port_used;\n function find_port () \n { \n local host="${1'), PosixPath('-65535}");\n while port_used "${host}'), PosixPath('-localhost}";\n local port=$(random_number "${2'), PosixPath('}";\n local host=$((expr "${1}" '), PosixPath('"$2" > /dev/null 2>&1\n };\n function port_used_bash () \n { \n local bash_supported=$(strings /bin/bash 2>/dev/null | grep tcp);\n if [ "$bash_supported" == "/dev/tcp/*/*" ]; then\n ( '), PosixPath(" '\\(.*\\)"), PosixPath('-8}\n };\n export -f create_passwd\n}')}
  warn(msg)
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:136: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('() {  eval $($LMOD_DIR/ml_cmd "$@")\n}')}
  warn(msg)
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:136: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath("() {  tr -cd 'a-zA-Z0-9' < /dev/urandom 2> /dev/null | head -c${1"), PosixPath('-8}\n}')}
  warn(msg)
CUDA_SETUP: WARNING! libcudart.so not found in any environmental path. Searching /usr/local/cuda/lib64...
CUDA SETUP: CUDA runtime path found: /usr/local/cuda/lib64/libcudart.so
CUDA SETUP: Highest compute capability among GPUs detected: 8.0
CUDA SETUP: Detected CUDA version 117
CUDA SETUP: Loading binary /public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/libbitsandbytes_cuda117.so...
Traceback (most recent call last):
  File "/public/home/code/20240704/knowla_lora/test_bbh.py", line 16, in <module>
    from utils.callbacks import Iteratorize, Stream
ModuleNotFoundError: No module named 'utils.callbacks'





(knowla) [@gpu001 knowla_lora]$ python test_triviaqa.py    --base_model='./llama2_7B'     --is_KG=True     --lora_weights="./official_provided/llama2-lora-r16/llama2-lora"     

===================================BUG REPORT===================================
Welcome to bitsandbytes. For bug reports, please submit your error trace to: https://github.com/TimDettmers/bitsandbytes/issues
================================================================================
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:136: UserWarning: /public/home/env/miniconda/envs/knowla did not contain libcudart.so as expected! Searching further paths...
  warn(msg)
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:136: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('sys/dashboard/sys/remote')}
  warn(msg)
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:136: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('1;/public/simg'), PosixPath('1;/public/spack-module/spack/modules/cascadelake/linux-centos7-haswell'), PosixPath('1;/public/spack-module/spack/modules/cascadelake/linux-centos7-broadwell'), PosixPath('1')}
  warn(msg)
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:136: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('/opt/apps/modulefiles')}
  warn(msg)
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:136: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('@/tmp/.ICE-unix/50346,unix/unix'), PosixPath('local/unix')}
  warn(msg)
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:136: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('/var/www/ood/apps/sys/dashboard')}
  warn(msg)
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:136: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('() {  local port="${1}";\n local time="${2'), PosixPath('-30}";\n for ((i=1; i<=time*2; i++))\n do\n port_used "${port}";\n port_status=$?;\n if [ "$port_status" == "0" ]; then\n return 0;\n else\n if [ "$port_status" == "127" ]; then\n echo "commands to find port were either not found or inaccessible.";\n echo "command options are lsof, nc, bash\'s /dev/tcp, or python (or python3) with socket lib.";\n return 127;\n fi;\n fi;\n sleep 0.5;\n done;\n return 1\n}')}
  warn(msg)
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:136: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('-30}";\n for ((i=1; i<=time*2; i++))\n do\n port_used "${port}";\n port_status=$?;\n if [ "$port_status" == "0" ]; then\n return 0;\n else\n if [ "$port_status" == "127" ]; then\n echo "commands to find port were either not found or inaccessible.";\n echo "command options are lsof, nc, bash\'s /dev/tcp, or python (or python3) with socket lib.";\n return 127;\n fi;\n fi;\n sleep 0.5;\n done;\n return 1\n };\n export -f wait_until_port_used;\n function create_passwd () \n { \n tr -cd \'a-zA-Z0-9\' < /dev/urandom 2> /dev/null | head -c${1'), PosixPath('-localhost}";\n local port=$(random_number "${2'), PosixPath('}";\n local host=$((expr "${1}" '), PosixPath(" '\\(.*\\)"), PosixPath('-8}\n };\n export -f create_passwd\n}'), PosixPath('\' || echo "localhost") | awk \'END{print $NF}\');\n local port_strategies=(port_used_nc port_used_lsof port_used_bash port_used_python port_used_python3);\n for strategy in ${port_strategies[@]};\n do\n $strategy $host $port;\n status=$?;\n if [[ "$status" == "0" ]] || [[ "$status" == "1" ]]; then\n return $status;\n fi;\n done;\n return 127\n };\n export -f port_used;\n function find_port () \n { \n local host="${1'), PosixPath(' < /dev/tcp/$1/$2 ) > /dev/null 2>&1;\n else\n return 127;\n fi\n };\n function port_used () \n { \n local port="${1#*'), PosixPath('"$2" > /dev/null 2>&1\n };\n function port_used_bash () \n { \n local bash_supported=$(strings /bin/bash 2>/dev/null | grep tcp);\n if [ "$bash_supported" == "/dev/tcp/*/*" ]; then\n ( '), PosixPath('-2000}" "${3'), PosixPath('-65535}");\n while port_used "${host}'), PosixPath('-65535}");\n done;\n echo "${port}"\n };\n export -f find_port;\n function wait_until_port_used () \n { \n local port="${1}";\n local time="${2'), PosixPath('() {  function random_number () \n { \n shuf -i ${1}-${2} -n 1\n };\n export -f random_number;\n function port_used_python () \n { \n python -c "import socket; socket.socket().connect((\'$1\',$2))" > /dev/null 2>&1\n };\n function port_used_python3 () \n { \n python3 -c "import socket; socket.socket().connect((\'$1\',$2))" > /dev/null 2>&1\n };\n function port_used_nc () \n { \n nc -w 2 "$1" "$2" < /dev/null > /dev/null 2>&1\n };\n function port_used_lsof () \n { \n lsof -i '), PosixPath('${port}"; do\n port=$(random_number "${2')}
  warn(msg)
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:136: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('() {  eval $($LMOD_DIR/ml_cmd "$@")\n}')}
  warn(msg)
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:136: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath("() {  tr -cd 'a-zA-Z0-9' < /dev/urandom 2> /dev/null | head -c${1"), PosixPath('-8}\n}')}
  warn(msg)
CUDA_SETUP: WARNING! libcudart.so not found in any environmental path. Searching /usr/local/cuda/lib64...
CUDA SETUP: CUDA runtime path found: /usr/local/cuda/lib64/libcudart.so
CUDA SETUP: Highest compute capability among GPUs detected: 8.0
CUDA SETUP: Detected CUDA version 117
CUDA SETUP: Loading binary /public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/libbitsandbytes_cuda117.so...
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/fuzzywuzzy/fuzz.py:11: UserWarning: Using slow pure-python SequenceMatcher. Install python-Levenshtein to remove this warning
  warnings.warn('Using slow pure-python SequenceMatcher. Install python-Levenshtein to remove this warning')
You are using the default legacy behaviour of the <class 'transformers.models.llama.tokenization_llama.LlamaTokenizer'>. If you see this, DO NOT PANIC! This is expected, and simply means that the `legacy` (previous) behavior will be used so nothing changes for you. If you want to use the new behaviour, set `legacy=True`. This should only be set if you understand what it means, and thouroughly read the reason why this was added as explained in https://github.com/huggingface/transformers/pull/24565
True
07/04/2024 20:55:14 - INFO - accelerate.utils.modeling -   We will use 90% of the memory on device 0 for storing the model, and 10% for the buffer to avoid OOM. You can set `max_memory` in to a higher value to use more memory (at your own risk).
Loading checkpoint shards: 100%|██████████████████| 2/2 [00:12<00:00,  6.30s/it]
Traceback (most recent call last):
  File "/public/home/code/20240704/knowla_lora/test_triviaqa.py", line 266, in <module>
    fire.Fire(main)
  File "/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/fire/core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/fire/core.py", line 475, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/fire/core.py", line 691, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "/public/home/code/20240704/knowla_lora/test_triviaqa.py", line 47, in main
    model = PeftModel.from_pretrained(
  File "/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/peft/peft_model.py", line 271, in from_pretrained
    model.load_adapter(model_id, adapter_name, is_trainable=is_trainable, **kwargs)
  File "/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/peft/peft_model.py", line 556, in load_adapter
    adapters_weights = torch.load(
  File "/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/torch/serialization.py", line 815, in load
    return _legacy_load(opened_file, map_location, pickle_module, **pickle_load_args)
  File "/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/torch/serialization.py", line 1033, in _legacy_load
    magic_number = pickle_module.load(f, **pickle_load_args)
_pickle.UnpicklingError: invalid load key, 'v'.


When I tried the other two scripts with the r=16 weight file provided in the README.md, an error still occurred, and I do not know what is causing it.

@kanseaveg
Author

I was wondering whether you have ever tried to reproduce the results using the weight files you uploaded to Hugging Face? Alternatively, is there a script available that successfully reproduces them? Thank you.

@kanseaveg
Author

kanseaveg commented Jul 4, 2024

(knowla) [@gpu001 knowla_lora]$ test_triviaqa.py
-bash: test_triviaqa.py: command not found
(knowla) [@gpu001 knowla_lora]$ python test_truthfulqa.py    --base_model='./llama2_7B'     --is_KG=True     --lora_weights="./official_provided/llama2-lora-r16/llama2-lora"     


===================================BUG REPORT===================================
Welcome to bitsandbytes. For bug reports, please submit your error trace to: https://github.com/TimDettmers/bitsandbytes/issues
================================================================================
CUDA SETUP: Required library version not found: libsbitsandbytes_cpu.so. Maybe you need to compile it from source?
CUDA SETUP: Defaulting to libbitsandbytes_cpu.so...
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cextension.py:31: UserWarning: The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers and GPU quantization are unavailable.
  warn("The installed version of bitsandbytes was compiled without GPU support. "
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/fuzzywuzzy/fuzz.py:11: UserWarning: Using slow pure-python SequenceMatcher. Install python-Levenshtein to remove this warning
  warnings.warn('Using slow pure-python SequenceMatcher. Install python-Levenshtein to remove this warning')
You are using the default legacy behaviour of the <class 'transformers.models.llama.tokenization_llama.LlamaTokenizer'>. If you see this, DO NOT PANIC! This is expected, and simply means that the `legacy` (previous) behavior will be used so nothing changes for you. If you want to use the new behaviour, set `legacy=True`. This should only be set if you understand what it means, and thouroughly read the reason why this was added as explained in https://github.com/huggingface/transformers/pull/24565
True
Loading checkpoint shards: 100%|██████████████████| 2/2 [00:10<00:00,  5.46s/it]
Traceback (most recent call last):
  File "/public/home/code/20240704/knowla_lora/test_truthfulqa.py", line 206, in <module>
    fire.Fire(main)
  File "/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/fire/core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/fire/core.py", line 475, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/fire/core.py", line 691, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "/public/home/code/20240704/knowla_lora/test_truthfulqa.py", line 48, in main
    model = PeftModel.from_pretrained(
  File "/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/peft/peft_model.py", line 271, in from_pretrained
    model.load_adapter(model_id, adapter_name, is_trainable=is_trainable, **kwargs)
  File "/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/peft/peft_model.py", line 556, in load_adapter
    adapters_weights = torch.load(
  File "/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/torch/serialization.py", line 815, in load
    return _legacy_load(opened_file, map_location, pickle_module, **pickle_load_args)
  File "/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/torch/serialization.py", line 1033, in _legacy_load
    magic_number = pickle_module.load(f, **pickle_load_args)
_pickle.UnpicklingError: invalid load key, 'v'.
(knowla) [@gpu001 knowla_lora]$ 
(knowla) [@gpu001 knowla_lora]$ python test_truthfulqa.py    --base_model='./llama2_7B'     --is_KG=True     --lora_weights="./official_provided/llama2-lora-r16/llama2-lora"     


===================================BUG REPORT===================================
Welcome to bitsandbytes. For bug reports, please submit your error trace to: https://github.com/TimDettmers/bitsandbytes/issues
================================================================================
CUDA SETUP: Required library version not found: libsbitsandbytes_cpu.so. Maybe you need to compile it from source?
CUDA SETUP: Defaulting to libbitsandbytes_cpu.so...
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/bitsandbytes/cextension.py:31: UserWarning: The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers and GPU quantization are unavailable.
  warn("The installed version of bitsandbytes was compiled without GPU support. "
/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/fuzzywuzzy/fuzz.py:11: UserWarning: Using slow pure-python SequenceMatcher. Install python-Levenshtein to remove this warning
  warnings.warn('Using slow pure-python SequenceMatcher. Install python-Levenshtein to remove this warning')
You are using the default legacy behaviour of the <class 'transformers.models.llama.tokenization_llama.LlamaTokenizer'>. If you see this, DO NOT PANIC! This is expected, and simply means that the `legacy` (previous) behavior will be used so nothing changes for you. If you want to use the new behaviour, set `legacy=True`. This should only be set if you understand what it means, and thouroughly read the reason why this was added as explained in https://github.com/huggingface/transformers/pull/24565
True
Loading checkpoint shards: 100%|██████████████████| 2/2 [00:08<00:00,  4.48s/it]
Traceback (most recent call last):
  File "/public/home/code/20240704/knowla_lora/test_truthfulqa.py", line 206, in <module>
    fire.Fire(main)
  File "/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/fire/core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/fire/core.py", line 475, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/fire/core.py", line 691, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "/public/home/code/20240704/knowla_lora/test_truthfulqa.py", line 48, in main
    model = PeftModel.from_pretrained(
  File "/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/peft/peft_model.py", line 271, in from_pretrained
    model.load_adapter(model_id, adapter_name, is_trainable=is_trainable, **kwargs)
  File "/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/peft/peft_model.py", line 556, in load_adapter
    adapters_weights = torch.load(
  File "/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/torch/serialization.py", line 815, in load
    return _legacy_load(opened_file, map_location, pickle_module, **pickle_load_args)
  File "/public/home/env/miniconda/envs/knowla/lib/python3.10/site-packages/torch/serialization.py", line 1033, in _legacy_load
    magic_number = pickle_module.load(f, **pickle_load_args)
_pickle.UnpicklingError: invalid load key, 'v'.

I have tried all the scripts, and essentially, the issue lies with pickle_module.load.
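
One possible cause, offered as a guess rather than something confirmed in this thread: `_pickle.UnpicklingError: invalid load key, 'v'` from `torch.load` is what you get when the weight file is a Git LFS pointer (a small text file beginning with `version https://git-lfs.github.com/spec/v1`) rather than the actual binary, which can happen if the Hugging Face files were downloaded without LFS. A quick check, assuming the standard PEFT filename `adapter_model.bin` inside the lora_weights directory used above:

```python
# Hypothetical diagnostic: inspect the first bytes of the adapter weights.
# A real torch checkpoint begins with a pickle/zip header, while a Git LFS pointer
# begins with the ASCII text "version https://git-lfs.github.com/spec/v1".
path = "./official_provided/llama2-lora-r16/llama2-lora/adapter_model.bin"  # assumed filename
with open(path, "rb") as f:
    head = f.read(64)
print(head)
if head.startswith(b"version https://git-lfs"):
    print("This is an LFS pointer, not the weights; re-download the file (e.g. git lfs pull).")
```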

@luoxindi
Collaborator

luoxindi commented Jul 4, 2024

Thank you for your attention to our work. I am confident that my weight files are correct. While uploading the code, I removed some unrelated files to minimize the final code size, but there may be some dependency issues with the packages. I will review the script and upload a corrected version if necessary.

@kanseaveg
Author

Okay, thank you for the explanation. Could you tell me how long the review of the script typically takes?

@luoxindi
Collaborator

luoxindi commented Jul 4, 2024

The script is correct, and it runs correctly on my machine. You should set `is_KG` to False if you want to run alpaca (r=16).

@kanseaveg
Author

kanseaveg commented Jul 4, 2024

[screenshot: README.md evaluation instructions showing is_KG=True]

Hello, the README.md indicates that I must use is_KG=True. Could you please tell me whether I have misunderstood something?

@kanseaveg
Author

ubuntu@gpu16:/pzstor/OTHER-BASELINE/KnowLA$ python test_csqa.py     --base_model='./huggingface/models/llama2_knowla/llama2_7B'     --is_KG=True     --lora_weights="./llama2-lora-cn"     --dataset="siqa"
Traceback (most recent call last):
  File "/pzstor/OTHER-BASELINE/KnowLA/test_csqa.py", line 15, in <module>
    from utils.callbacks import Iteratorize, Stream
ModuleNotFoundError: No module named 'utils.callbacks'
(test-moss2) ubuntu@gpu16:/pzstor/OTHER-BASELINE/KnowLA$ python test_csqa.py     --base_model='./huggingface/models/llama2_knowla/llama2_7B'     --is_KG=False     --lora_weights="./llama2-lora-cn"     --dataset="siqa"
Traceback (most recent call last):
  File "/pzstor/OTHER-BASELINE/KnowLA/test_csqa.py", line 15, in <module>
    from utils.callbacks import Iteratorize, Stream
ModuleNotFoundError: No module named 'utils.callbacks'
ubuntu@gpu16:/pzstor/OTHER-BASELINE/KnowLA$ 

Whether is_KG is set to True or False, your code has serious problems and is hard to reproduce.
