Conversation

Copilot AI commented Dec 15, 2025

The system prompt was hardcoded at src/rag.py line 27. The tenant-level override worked, but the global default was not configurable.

Changes

  • src/config.py: Add default_system_prompt field to LLMConfig (env: LLM_DEFAULT_SYSTEM_PROMPT); a sketch follows this list
  • src/rag.py: Read from config.llm.default_system_prompt instead of hardcoded string
  • src/multi_tenant.py: Use config value as fallback for constructor parameter
  • env.example: Document LLM_DEFAULT_SYSTEM_PROMPT option
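
A minimal sketch of what the src/config.py and src/rag.py changes could look like. The dataclass shape and the Config wrapper are illustrative assumptions, not the repository's actual code; only the field name, the env var, and the config.llm.default_system_prompt lookup come from the change list above.

```python
import os
from dataclasses import dataclass, field

# Unchanged default, used when LLM_DEFAULT_SYSTEM_PROMPT is not set.
_FALLBACK_SYSTEM_PROMPT = (
    "You are a helpful assistant. Provide direct answers "
    "without showing your reasoning process."
)


@dataclass
class LLMConfig:
    # Global default system prompt, overridable via the environment.
    default_system_prompt: str = field(
        default_factory=lambda: os.getenv(
            "LLM_DEFAULT_SYSTEM_PROMPT", _FALLBACK_SYSTEM_PROMPT
        )
    )


@dataclass
class Config:
    llm: LLMConfig = field(default_factory=LLMConfig)


# In src/rag.py, the former hardcoded string becomes a config lookup:
# system_prompt = config.llm.default_system_prompt
```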

Usage

# Override default system prompt (optional)
LLM_DEFAULT_SYSTEM_PROMPT="Your custom system prompt here"

Default value unchanged: "You are a helpful assistant. Provide direct answers without showing your reasoning process."
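
The src/multi_tenant.py change presumably follows the usual "explicit argument wins, config otherwise" pattern; the class and constructor below are hypothetical names used only to illustrate it.

```python
from typing import Optional


class MultiTenantRAG:  # hypothetical name, for illustration only
    def __init__(self, config, system_prompt: Optional[str] = None):
        # A tenant-level override still takes precedence; only the
        # global fallback now comes from configuration.
        self.system_prompt = system_prompt or config.llm.default_system_prompt
```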

Original prompt

This section describes the original issue to resolve.

<issue_title>Default system prompt hardcoded</issue_title>
<issue_description>## Problem Description
The system prompt is hardcoded.

Affected files

  • src/rag.py, line 27

Problem

Although it can be overridden per tenant, the global default value is hardcoded.

Proposed solution

It should be configurable via a DEFAULT_SYSTEM_PROMPT environment variable.</issue_description>

Comments on the Issue (you are @copilot in this section)


- Add `default_system_prompt` field to LLMConfig in src/config.py
- Update src/rag.py to use config.llm.default_system_prompt
- Update src/multi_tenant.py to use config value as default
- Add LLM_DEFAULT_SYSTEM_PROMPT to env.example with documentation

Co-authored-by: BukeLy <19304666+BukeLy@users.noreply.github.com>
Copilot AI changed the title from "[WIP] Fix hardcoded default system prompt in rag.py" to "feat: make DEFAULT_SYSTEM_PROMPT configurable via environment variable" on Dec 15, 2025
Copilot AI requested a review from BukeLy December 15, 2025 09:42