feat: make DEFAULT_SYSTEM_PROMPT configurable via environment variable #36
The system prompt was hardcoded in `src/rag.py` (line 27). While the tenant-level override worked, the global default was not configurable.

## Changes

- `src/config.py`: Add a `default_system_prompt` field to `LLMConfig` (env: `LLM_DEFAULT_SYSTEM_PROMPT`); see the sketch after this list
- `src/rag.py`: Read from `config.llm.default_system_prompt` instead of a hardcoded string
- `src/multi_tenant.py`: Use the config value as the fallback for the constructor parameter
- `env.example`: Document the `LLM_DEFAULT_SYSTEM_PROMPT` option
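A minimal sketch of what the new field could look like, assuming a plain dataclass-based config; the real `LLMConfig` in `src/config.py` may use a different settings framework, but the field name and environment variable match the list above:

```python
# Hypothetical sketch of the new config field; the actual src/config.py
# implementation may be structured differently.
import os
from dataclasses import dataclass, field

# Built-in default, unchanged by this PR.
_DEFAULT_SYSTEM_PROMPT = (
    "You are a helpful assistant. Provide direct answers "
    "without showing your reasoning process."
)


@dataclass
class LLMConfig:
    # LLM_DEFAULT_SYSTEM_PROMPT overrides the built-in default when set.
    default_system_prompt: str = field(
        default_factory=lambda: os.getenv(
            "LLM_DEFAULT_SYSTEM_PROMPT", _DEFAULT_SYSTEM_PROMPT
        )
    )
```

The constructor fallback in `src/multi_tenant.py` would then pick this value up whenever no explicit prompt is passed, so the environment variable only applies where no more specific (tenant-level or constructor) prompt is provided.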
## Usage

The default value is unchanged:

> "You are a helpful assistant. Provide direct answers without showing your reasoning process."