
chat: fix issues with quickly switching between multiple chats #2343

Merged
12 commits merged into main on May 15, 2024

Conversation

cebtenzzre (Member) commented May 13, 2024

Switching between chats too quickly can leave the UI in inconsistent states (e.g. a progress bar stuck at 60%) and sometimes apparently deadlocked (observed during the new "Waiting for model" status).

To fix UI inconsistencies:

  • Move trySwitchContextInProgress, modelLoadingPercentage, and isCurrentlyLoading properties from the window to the chat so the load status always reflects the current chat
  • Initialize load status at the beginning of loadModel instead of only in setModelInfo, since reloadModel also calls loadModel
  • If loadModel is cancelled, return early instead of falling back and reporting an "invalid model" error
  • On load failure, instead of showing only the bottom "reload" button, show neither
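The first bullet's idea of tracking load status per chat rather than per window can be sketched roughly as follows. This is a simplified, Qt-free illustration with hypothetical struct and member names, not the actual GPT4All code:

```cpp
#include <cassert>
#include <cstddef>
#include <string>
#include <vector>

// Each Chat owns its own load status, so switching the active chat
// never shows another chat's progress bar or loading state.
struct Chat {
    std::string name;
    // Per-chat load state (previously window-global)
    bool trySwitchContextInProgress = false;
    float modelLoadingPercentage = 0.0f; // 0.0 .. 1.0
    bool isCurrentlyLoading = false;
};

struct ChatListModel {
    std::vector<Chat> chats;
    std::size_t currentIndex = 0;

    Chat &currentChat() { return chats[currentIndex]; }

    // The UI binds to the *current* chat's status, so what is displayed
    // always reflects the chat being viewed.
    float displayedProgress() { return currentChat().modelLoadingPercentage; }
};
```

With window-global state, a progress value left over from chat A could keep showing after switching to chat B; here, changing `currentIndex` immediately changes which chat's status is displayed.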

To prevent quickly switching chats from queueing many slow operations:

  • Skip context switch if an unload is pending
  • Skip unnecessary calls to LLModel::saveState
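A hedged sketch of how these two skips might look, using hypothetical names rather than the actual GPT4All code:

```cpp
#include <cassert>

// Coalesce queued work so rapid chat switches don't pile up slow
// operations: a context switch is pointless if an unload is already
// pending, and saving state is pointless if the state hasn't changed.
struct ChatLLM {
    bool unloadRequested = false;
    bool stateDirty = false; // set whenever the model's state changes
    int contextSwitches = 0;
    int savesPerformed = 0;

    void requestUnload() { unloadRequested = true; }

    void trySwitchContext() {
        if (unloadRequested)
            return; // skip: the model is going away anyway
        ++contextSwitches;
    }

    void saveStateIfNeeded() {
        if (!stateDirty)
            return; // skip an unnecessary (slow) save of model state
        ++savesPerformed;
        stateDirty = false;
    }
};
```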

UI enhancement:

  • Show "waiting for model" as a separate state when switching context, so it's clearer what is taking so long

Other bugfix:

  • Use std::unique_ptr in the LLModelStore and fix a related memory leak - not absolutely necessary for this PR, but it makes some other changes slightly simpler

Non-functional changes:

  • Use std::optional to store zero or one items in the LLModelStore
  • Remove m_shouldTrySwitchContext, which didn't do anything useful, and simplify the related code
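The store changes can be illustrated with a minimal sketch, assuming a store that holds at most one model at a time (all names hypothetical; the real LLModelStore differs in detail):

```cpp
#include <cassert>
#include <memory>
#include <mutex>
#include <optional>

struct LLModel { int id; };

// std::optional expresses "zero or one items" directly, and
// std::unique_ptr ensures the stored model is freed when the store is
// destroyed instead of leaking.
class LLModelStore {
public:
    // Take the stored model, if any, leaving the store empty.
    std::unique_ptr<LLModel> take() {
        std::lock_guard<std::mutex> lock(m_mutex);
        if (!m_slot)
            return nullptr;
        auto model = std::move(*m_slot);
        m_slot.reset();
        return model;
    }

    void put(std::unique_ptr<LLModel> model) {
        std::lock_guard<std::mutex> lock(m_mutex);
        m_slot = std::move(model); // any previously stored model is freed here
    }

    bool empty() const {
        std::lock_guard<std::mutex> lock(m_mutex);
        return !m_slot;
    }

private:
    mutable std::mutex m_mutex;
    std::optional<std::unique_ptr<LLModel>> m_slot;
};
```

A raw pointer with `nullptr` as the empty state would also work, but then ownership and emptiness are implicit; the optional-of-unique_ptr spelling makes both explicit in the type.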

Signed-off-by: Jared Van Bortel <jared@nomic.ai>
@cebtenzzre cebtenzzre force-pushed the fix-chat-switching branch 2 times, most recently from f3cbd09 to bcfcb9c on May 14, 2024 17:59
@cebtenzzre cebtenzzre marked this pull request as ready for review May 14, 2024 17:59
@cebtenzzre cebtenzzre requested a review from manyoso May 14, 2024 17:59
@cebtenzzre cebtenzzre marked this pull request as draft May 14, 2024 18:26
@cebtenzzre cebtenzzre marked this pull request as ready for review May 14, 2024 19:11
There is no reason for reloadModel to show slightly different status information from setModelInfo.

This fixes a memory leak if there was a model in the store on exit, e.g.
if the user loads a model and then switches to a chat associated with a
different model file without loading it.

ChatLLM::destroyStore is added to fix a heap-use-after-free caused by
the unique_ptr being freed too late. Global destructors are hard.

When switching context, this state can take a significant amount of
time. Separate it out so the user is less likely to think GPT4All is
completely stuck.
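One way to model the separated state, as a rough sketch with a hypothetical enum (the actual GPT4All UI states are wired through Qt properties and may be named differently):

```cpp
#include <cassert>
#include <string>

// Making "waiting for model" a distinct state lets the UI tell the user
// that another chat is still releasing the model, rather than showing a
// generic (and seemingly stuck) loading indicator.
enum class LoadState { Idle, WaitingForModel, Loading, Loaded, Failed };

const char *statusText(LoadState s) {
    switch (s) {
    case LoadState::WaitingForModel: return "waiting for model";
    case LoadState::Loading:         return "loading model";
    case LoadState::Loaded:          return "ready";
    case LoadState::Failed:          return "load failed";
    default:                         return "";
    }
}
```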

This aligns its behavior with the upper "reload" button.

The boolean flag doesn't do anything useful.

@cebtenzzre cebtenzzre changed the title from "chat: trying to fix issues with quickly switching chats" to "chat: fix issues with quickly switching between multiple chats" on May 14, 2024
manyoso (Collaborator) commented May 15, 2024

This is tricky code. Let's do a huddle for review on this one.

@cebtenzzre cebtenzzre merged commit 7e1e00f into main May 15, 2024
6 of 10 checks passed
@cebtenzzre cebtenzzre deleted the fix-chat-switching branch May 15, 2024 18:07