
Persistent bug with many models #6571

Open
1 task done
elfsovereign opened this issue Dec 12, 2024 · 3 comments
Labels
bug Something isn't working

Comments

@elfsovereign

Describe the bug

I've had an issue where, whenever I submit a prompt, generation fails regardless of the model, and it can even kill the program. For reference, I'm running an RTX 2060 with 6 GB of VRAM, 16 GB of regular RAM, and an Intel i3-2100F processor. I know it's not the most powerful computer, but I've been trying 7B, 3B, and 1B models (with different settings and sizes, just to see if the bug goes away with smaller models). I'm primarily looking for text generation. It seems that only GGUF models actually work, but I don't know whether that's a limitation of Oobabooga overall, or the community turning away from other formats, or what. I'm sadly a novice with programming and just doing the best I can. Here's the error it prints:

```
  File "E:\text-generation-webui-main\installer_files\env\Lib\site-packages\gradio\queueing.py", line 527, in process_events
    response = await route_utils.call_process_api(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\text-generation-webui-main\installer_files\env\Lib\site-packages\gradio\route_utils.py", line 261, in call_process_api
    output = await app.get_blocks().process_api(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\text-generation-webui-main\installer_files\env\Lib\site-packages\gradio\blocks.py", line 1786, in process_api
    result = await self.call_function(
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\text-generation-webui-main\installer_files\env\Lib\site-packages\gradio\blocks.py", line 1338, in call_function
    prediction = await anyio.to_thread.run_sync(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\text-generation-webui-main\installer_files\env\Lib\site-packages\anyio\to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\text-generation-webui-main\installer_files\env\Lib\site-packages\anyio\_backends\_asyncio.py", line 2505, in run_sync_in_worker_thread
    return await future
           ^^^^^^^^^^^^
  File "E:\text-generation-webui-main\installer_files\env\Lib\site-packages\anyio\_backends\_asyncio.py", line 1005, in run
    result = context.run(func, *args)
             ^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\text-generation-webui-main\installer_files\env\Lib\site-packages\gradio\utils.py", line 759, in wrapper
    response = f(*args, **kwargs)
               ^^^^^^^^^^^^^^^^^^
  File "E:\text-generation-webui-main\modules\chat.py", line 1141, in handle_character_menu_change
    html = redraw_html(history, state['name1'], state['name2'], state['mode'], state['chat_style'], state['character_menu'])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\text-generation-webui-main\modules\chat.py", line 490, in redraw_html
    return chat_html_wrapper(history, name1, name2, mode, style, character, reset_cache=reset_cache)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\text-generation-webui-main\modules\html_generator.py", line 326, in chat_html_wrapper
    return generate_cai_chat_html(history['visible'], name1, name2, style, character, reset_cache)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\text-generation-webui-main\modules\html_generator.py", line 250, in generate_cai_chat_html
    row = [convert_to_markdown_wrapped(entry, use_cache=i != len(history) - 1) for entry in _row]
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\text-generation-webui-main\modules\html_generator.py", line 250, in <listcomp>
    row = [convert_to_markdown_wrapped(entry, use_cache=i != len(history) - 1) for entry in _row]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\text-generation-webui-main\modules\html_generator.py", line 172, in convert_to_markdown_wrapped
    return convert_to_markdown.__wrapped__(string)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\text-generation-webui-main\modules\html_generator.py", line 78, in convert_to_markdown
    string = re.sub(pattern, replacement, string, flags=re.MULTILINE)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\text-generation-webui-main\installer_files\env\Lib\re\__init__.py", line 185, in sub
    return _compile(pattern, flags).sub(repl, string, count)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: expected string or bytes-like object, got 'NoneType'
```

Is there an existing issue for this?

  • I have searched the existing issues

Reproduction

I've tried a variety of different models, and it always fails with an error like this (I believe it's identical each time).

Screenshot

No response

Logs

N/A

System Info

RTX 2060 with 6 GB of VRAM, 16 GB of regular RAM, and an Intel i3-2100F processor
@elfsovereign elfsovereign added the bug Something isn't working label Dec 12, 2024
@elfsovereign
Author

Oh, I rather dumbly forgot to say: given that I'm a novice with advanced software like this, I wouldn't be surprised if it was just a dumb move on my part, but I'd love to know how to fix it.

@SobakaFox

The error TypeError: expected string or bytes-like object, got 'NoneType' means that at some point the code tries to process an object that is None in a function that expects a string. The error occurs in the html_generator.py module, where re.sub is called with a string variable that turns out to be None.
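For anyone wanting to see what's going on: a minimal sketch of the failure mode and a possible guard. The function name and pattern below are hypothetical stand-ins, not the project's actual code or patch — they just show that re.sub raises this exact TypeError when its string argument is None, and that checking for None first avoids the crash.

```python
import re

def convert_to_markdown_guarded(string):
    """Hypothetical guarded version of the failing call in html_generator.py.
    re.sub requires a str or bytes `string` argument, so a None chat-history
    entry raises: TypeError: expected string or bytes-like object."""
    if string is None:  # assumed fix: treat a missing entry as empty text
        string = ""
    # stand-in pattern; the real module applies its own substitutions
    return re.sub(r"\*\*(.+?)\*\*", r"<b>\1</b>", string, flags=re.MULTILINE)

# Reproducing the error from the traceback:
try:
    re.sub(r"a", "b", None)
except TypeError as e:
    print(type(e).__name__)  # TypeError

print(repr(convert_to_markdown_guarded(None)))  # ''
print(convert_to_markdown_guarded("**hi**"))    # <b>hi</b>
```

Whether an empty string is the right fallback (versus skipping the entry entirely) depends on how the None got into the chat history in the first place.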

@kalle07

kalle07 commented Dec 12, 2024

Please vote for my bug too, it's the same:
#6563
