Fixed an issue where AI Reports fail with OpenAI models that do not support the temperature parameter. #9725
dpage wants to merge 1 commit into pgadmin-org:master
Fixed an issue where AI Reports fail with OpenAI models that do not support the temperature parameter. pgadmin-org#9719

Removed the temperature parameter from all LLM provider clients and pipeline calls, allowing each model to use its default. This fixes compatibility with GPT-5-mini/nano and future models that don't support user-configurable temperature.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
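The fix described above amounts to no longer sending a `temperature` value, so each model falls back to its provider-side default. A minimal sketch of the idea (hypothetical function and parameter names, not pgAdmin's actual client code):

```python
def build_payload(model, messages, temperature=None):
    """Build a chat-completion request payload.

    Omitting the "temperature" key entirely lets the provider apply the
    model's own default; some models (e.g. GPT-5-mini/nano) reject
    requests that set a user-configurable temperature.
    """
    payload = {"model": model, "messages": messages}
    # Include temperature only when the caller explicitly sets one.
    if temperature is not None:
        payload["temperature"] = temperature
    return payload


payload = build_payload("gpt-5-mini", [{"role": "user", "content": "Hi"}])
assert "temperature" not in payload
```

The PR goes further and removes the parameter from the client interface altogether; the sketch keeps it optional only to show why omitting the key, rather than sending a default value, is what restores compatibility.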
Walkthrough

The changes remove the temperature parameter from the LLM client interface and all provider implementations (OpenAI, Anthropic, Ollama, Docker), plus from the report pipeline usage. A release note documents the fix for AI Reports failing with OpenAI models that don't support the temperature parameter.