Fix async peer LLM token lookup, duplicate token bug, and improve async message view#1162
Conversation
…, and fixed the LLM peer lookup so it can actually retrieve API keys at call time
fix dark mode for dropdown menu in course home and for chapter naviga…
This also fixes issue #612, where the dropdown menu on the course home page and in course navigation showed up as white text on a white background.
bnmnetp
left a comment
How are you forcing the course home page to be in dark mode?
I can see the problem you describe on RST built book pages where you turn on dark mode, but I believe that we have variables defined to use with the different modes. RST books are deprecated but I'm happy to merge PRs for them by others.
I would prefer this be done as a separate PR from the LLM token lookup. It makes it easier to track and easier to back out a merge if needed.
…r navigation" This reverts commit eca2ee1.
Thanks for the fix!! I didn't expect that the custom option would be used (ever??!!), so I'm curious why you used this instead of openai?
This PR fixes a few issues with the async peer LLM feature and cleans up the instructor view.

There was a duplicate-token issue on the add-token screen: the custom provider name input happened to share the `token-input` CSS class with the actual key field, so the submit handler was collecting it as a second token every time. Renaming the class fixes this.

Also fixed the async peer view so that, when there is no LLM, all messages from a student show up in order instead of only their last one. The LLM prompt has been updated as well.
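The class-collision bug can be sketched in isolation (the field names and helper below are hypothetical, not the actual Runestone code): a submit handler that selects every element carrying the shared class will scoop up the provider-name field alongside the real key field.

```javascript
// Minimal sketch of the duplicate-token bug, assuming a submit handler
// that gathers values by CSS class (hypothetical field names).
// Each object stands in for a form input element.
function collectTokens(fields, selectorClass) {
  // Mimics document.querySelectorAll("." + selectorClass) over the form.
  return fields
    .filter((f) => f.className.split(/\s+/).includes(selectorClass))
    .map((f) => f.value);
}

// Before the fix: the custom-provider name shares the "token-input" class.
const buggyForm = [
  { className: "token-input", value: "sk-abc123" },          // the real API key
  { className: "token-input", value: "my-custom-provider" }, // provider name!
];
console.log(collectTokens(buggyForm, "token-input"));
// -> ["sk-abc123", "my-custom-provider"]  (provider name submitted as a token)

// After the fix: the provider-name field gets its own class.
const fixedForm = [
  { className: "token-input", value: "sk-abc123" },
  { className: "provider-name-input", value: "my-custom-provider" },
];
console.log(collectTokens(fixedForm, "token-input"));
// -> ["sk-abc123"]
```

Renaming the class (rather than changing the handler) keeps the selector-based collection logic intact while making the provider-name field invisible to it.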