Symptom
Using the chat feature in the TUI with no models downloaded causes a crash. Sending any message (e.g., "Hi Bob") crashes the app instead of showing a graceful error.
Expected Behavior
The chat screen should:
- Check if any models are available before allowing chat
- If no models are downloaded, show a clear message (e.g., "No models available. Download a model first.")
- If a model request fails (404, connection refused, etc.), display the error in the chat area without crashing
Suggested Fix
- Add error handling around the Ollama chat API call
- Check for available models on chat screen mount — disable input or show a prompt if none exist
- Wrap the streaming response handler in try/except to catch connection errors, model-not-found, etc.