Python: Add MLflow AI Gateway provider samples #5507
PattaraS wants to merge 3 commits into microsoft:main from
Conversation
Adds a new providers/mlflow_gateway/ directory with an example showing how to route Agent Framework requests through MLflow AI Gateway's OpenAI-compatible endpoint using the existing OpenAIChatClient. Follows the same pattern as ollama_with_openai_chat_client.py — no new dependencies required beyond the OpenAI integration.
Pull request overview
Adds a new Python provider sample demonstrating how to route Agent Framework OpenAIChatClient traffic through MLflow AI Gateway’s OpenAI-compatible endpoint, plus documentation and index entry updates.
Changes:
- Added an MLflow AI Gateway provider sample with both streaming and non-streaming agent runs (including tool calling).
- Added a provider-specific README describing MLflow AI Gateway setup and required environment variables.
- Updated the providers index README to include the new `mlflow_gateway/` folder.
Reviewed changes
Copilot reviewed 3 out of 3 changed files in this pull request and generated 3 comments.
| File | Description |
|---|---|
| python/samples/02-agents/providers/mlflow_gateway/mlflow_gateway_with_openai_chat_client.py | New sample showing OpenAIChatClient(base_url=...) configured for MLflow AI Gateway with streaming/non-streaming runs and a simple tool. |
| python/samples/02-agents/providers/mlflow_gateway/README.md | New documentation for prerequisites, configuration, and usage of the MLflow AI Gateway sample. |
| python/samples/02-agents/providers/README.md | Adds mlflow_gateway/ entry to the provider samples index table. |
```markdown
|| File | Description ||
||------|-------------||
|| [`mlflow_gateway_with_openai_chat_client.py`](mlflow_gateway_with_openai_chat_client.py) | Connect an Agent Framework agent to MLflow AI Gateway via the OpenAI-compatible endpoint. Shows both streaming and non-streaming responses with tool calling. ||
```
The examples table is malformed markdown (each row starts with ||), so it won’t render as a 2‑column table. Use standard table syntax with a single leading | per row (matching other provider READMEs).
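A corrected version of that table, using the standard single-`|` row syntax the other provider READMEs follow (cell text taken from the quoted rows), would be:

```markdown
| File | Description |
|------|-------------|
| [`mlflow_gateway_with_openai_chat_client.py`](mlflow_gateway_with_openai_chat_client.py) | Connect an Agent Framework agent to MLflow AI Gateway via the OpenAI-compatible endpoint. Shows both streaming and non-streaming responses with tool calling. |
```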
```python
_client = OpenAIChatClient(
    api_key="unused",  # Provider keys are managed by the MLflow server
    base_url=os.getenv("MLFLOW_GATEWAY_ENDPOINT"),
    model=os.getenv("MLFLOW_GATEWAY_MODEL"),
)
```
base_url comes from os.getenv("MLFLOW_GATEWAY_ENDPOINT"); if it’s unset (or empty), OpenAIChatClient will fall back to the default OpenAI base URL and may send prompts to OpenAI unexpectedly. Consider validating MLFLOW_GATEWAY_ENDPOINT (and MLFLOW_GATEWAY_MODEL) up front and failing fast with a clear message before constructing the client.
```python
_client = OpenAIChatClient(
    api_key="unused",  # Provider keys are managed by the MLflow server
    base_url=os.getenv("MLFLOW_GATEWAY_ENDPOINT"),
    model=os.getenv("MLFLOW_GATEWAY_MODEL"),
)
```
Same as above: if MLFLOW_GATEWAY_ENDPOINT isn’t set, the client may default to OpenAI’s public endpoint. Validate required env vars once (e.g., in main()) and pass the resolved values into both examples to avoid accidental misrouting.
Address Copilot review feedback: without explicit validation, an unset or empty MLFLOW_GATEWAY_ENDPOINT would cause OpenAIChatClient to silently fall back to OpenAI's public endpoint and forward prompts there. Validate both env vars in main() and pass the resolved values into both example functions, failing fast with a clear error message.
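A minimal sketch of that fail-fast validation (the function name, tuple of variable names, and error wording here are illustrative, not necessarily the exact code pushed in the fix):

```python
import os

# Both variables must be set for the sample to reach the gateway
# instead of falling back to OpenAI's public endpoint.
REQUIRED_ENV_VARS = ("MLFLOW_GATEWAY_ENDPOINT", "MLFLOW_GATEWAY_MODEL")


def resolve_gateway_settings() -> tuple[str, str]:
    """Return (endpoint, model), raising if either env var is unset or empty."""
    missing = [name for name in REQUIRED_ENV_VARS if not os.getenv(name)]
    if missing:
        raise RuntimeError(
            "Missing required environment variables: "
            + ", ".join(missing)
            + ". Set them before running the MLflow AI Gateway sample."
        )
    return os.environ["MLFLOW_GATEWAY_ENDPOINT"], os.environ["MLFLOW_GATEWAY_MODEL"]
```

`main()` can call this once and pass the resolved values into both example functions, so neither constructs a client with a `None` or empty `base_url`.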
Thanks for the review! Pushed 48ac8c9:
Motivation
MLflow AI Gateway (MLflow ≥ 3.0) is an open-source, database-backed LLM proxy that provides a unified API across 20+ providers (OpenAI, Anthropic, Gemini, Mistral, Bedrock, Ollama, etc.) with encrypted secrets management, fallback/retry, traffic splitting, and budget tracking.
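For context, a gateway endpoint is defined in a YAML config file passed to the MLflow server; a minimal sketch (endpoint name, model name, and key variable are illustrative) might look like:

```yaml
endpoints:
  - name: chat                  # becomes part of the endpoint URL clients call
    endpoint_type: llm/v1/chat
    model:
      provider: openai          # one of the 20+ supported providers
      name: gpt-4o-mini         # illustrative model name
      config:
        openai_api_key: $OPENAI_API_KEY  # resolved server-side, never exposed to clients
```

The server is then started with this config (in recent MLflow versions, via `mflow deployments start-server`-style tooling), and the sample's `MLFLOW_GATEWAY_ENDPOINT` points at the resulting OpenAI-compatible URL.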
Agent Framework users currently have no documented path for using MLflow AI Gateway as their LLM backend. This PR adds a sample to fill that gap.
Description
Adds `python/samples/02-agents/providers/mlflow_gateway/` with:
- `mlflow_gateway_with_openai_chat_client.py` — Example showing how to route Agent Framework requests through MLflow AI Gateway's OpenAI-compatible endpoint using the existing `OpenAIChatClient` with a custom `base_url`. Includes both streaming and non-streaming examples with tool calling.
- `README.md` — Setup instructions (install MLflow, start the server, create a gateway endpoint), environment variable configuration, and feature overview.

The pattern follows the established `ollama_with_openai_chat_client.py` example — no new dependencies are required beyond the existing OpenAI integration.

Also updates `python/samples/02-agents/providers/README.md` to add the new folder to the index table.
Checklist
- Follows the existing provider sample layout (`ollama/`, `openai/`, etc.)
- Added to the `providers/README.md` index
AI Disclosure
This pull request was AI-assisted by Claude. All content was reviewed and validated by a human contributor.