
Accidentally deleted my chats, fewer threads, and less user-friendly UI #109

Open
IndustrialOne opened this issue Sep 2, 2024 · 0 comments


IndustrialOne commented Sep 2, 2024

I upgraded from 0.2.31 to 0.3.2 and there are several issues that make it feel like a downgrade.

  1. The garbage can icon for deleting chats should appear when you hover over a chat, as before, with a confirmation prompt. Having to click the three dots and then Delete every time is unintuitive, and it caused me to delete my whole Migrated chats folder when I intended to delete the latest chat above it.

I recovered the files with an undelete app, but they're either corrupted or I'm doing something wrong, because they all appear as Unknown chats when I fire up LM Studio. Those are the JSON files in the conversation folder. In the chats folder, the contents seem preserved when I open them in a text editor, but copying them to the conversation folder also did nothing.
How do I restore these to how they were before my mistake?
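For reference, here is a quick sketch for triaging the recovered files: it reports which ones still parse as valid JSON, which at least separates truly corrupted files from ones LM Studio just fails to recognize. The folder path is only a guess; point it at wherever your conversation folder actually lives.

```python
import json
from pathlib import Path

def check_chat_files(conv_dir):
    """Report which recovered .json chat files still parse as valid JSON."""
    results = {}
    for path in sorted(Path(conv_dir).glob("*.json")):
        try:
            # A file that fails here was damaged during deletion/recovery.
            json.loads(path.read_text(encoding="utf-8"))
            results[path.name] = True
        except (json.JSONDecodeError, UnicodeDecodeError):
            results[path.name] = False
    return results

if __name__ == "__main__":
    # Hypothetical location; adjust to your LM Studio install.
    conv_dir = Path.home() / ".cache" / "lm-studio" / "conversations"
    for name, ok in check_chat_files(conv_dir).items():
        print(name, "OK" if ok else "CORRUPTED")
```

Files that parse cleanly but still show as Unknown chats may just be missing metadata the new version expects, which would be a different problem from corruption.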

  2. It no longer shows the token count when I type or paste text into the chat. You need to bring this back; it's crucial for me to know how close to the context window I am when I paste an entire novel/post/document. Sure, it shows the token count AFTER each prompt, but I need to know how large the initial paste will be.

  3. Why does it only allow me to select half the threads my hardware is capable of using? I have a 6-core CPU with hyperthreading, and the settings now cap me at a maximum of 3 threads when I could select 6 before. Performance didn't take a huge hit, but I notice increased lag between responses.

  4. Why does the app continue processing after it finishes a response to my prompt? When I press Send after typing my reply, it says I have to wait for generation to finish even though it already did. I can press "Stop generating" and then send, but this seems to cause a longer delay before the LLM's next response. The previous version of LM Studio I used did not do this.

  5. Why can we no longer save our settings as a profile? It's annoying having to update the system prompt for every single new chat I start.
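As a stopgap for the missing live token counter, I've been sanity-checking big pastes with a rough character-count heuristic before sending them. This is a ballpark only (roughly four characters per token for English prose), not the model's actual tokenizer, and the helper below is my own, not part of LM Studio:

```python
def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token for English prose.

    The true count depends on the model's tokenizer, so treat this as a
    ballpark sanity check only (hypothetical helper, not an LM Studio API).
    """
    return max(1, len(text) // 4)

if __name__ == "__main__":
    with open("novel.txt", encoding="utf-8") as f:  # the document to paste
        print(f"~{estimate_tokens(f.read())} tokens")
```

If the estimate comes out near the context window, I trim the paste before sending rather than finding out after the fact.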
