Hi Prism team,
I'd like to request a feature that allows passing arbitrary/custom request parameters to provider API calls, starting with OpenAI Responses API fields such as `prompt_cache_key`, but not limited to those.
Right now, provider handlers expose a fixed set of mapped options. This makes it hard to use newly released provider params (or less-common ones) without patching vendor code.
What I’m asking for
Please extend `withProviderOptions` so it can accept any keys and pass them through to the provider request payload (while keeping the current typed/mapped options intact).
In other words: `withProviderOptions` should work as an escape hatch for provider-specific fields that Prism doesn't yet model explicitly.
Ideally, this pattern should be available across all providers, not only OpenAI.
Why this matters
- Providers ship new parameters faster than SDK abstractions can expose them
- Teams can adopt provider features immediately without waiting for Prism releases
- Avoids local forks/vendor patches
- Keeps Prism flexible while preserving a clean default API
OpenAI concrete example
I need to send fields like:
- `prompt_cache_key`
- `safety_identifier`
- `prompt_cache_retention`
- and potentially other new/experimental params for `POST /responses`
Reference:
https://developers.openai.com/api/reference/resources/responses/methods/create#responses_create-prompt_cache_key
Possible API shape
```php
Prism::text()
    ->using('openai', 'gpt-5')
    ->withProviderOptions([
        // existing Prism-known options
        'service_tier' => 'auto',
        'truncation' => 'disabled',
        // arbitrary pass-through keys (new behavior)
        'prompt_cache_key' => 'my-cache-key',
        'some_future_provider_field' => 'value',
    ]);
```
Expected behavior:
- Prism keeps handling known options as today
- Unknown keys in `withProviderOptions` are forwarded as-is to the provider payload
- Same principle applied consistently for other providers
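To make the expected merge semantics concrete, here is a minimal sketch of how a provider handler could split known options from pass-through keys. `KNOWN_OPTIONS` and `buildPayload` are illustrative names I made up for this sketch, not Prism's actual internals:

```php
<?php
// Hypothetical sketch of pass-through merging in a provider handler.
// KNOWN_OPTIONS and buildPayload are illustrative, not Prism internals.

const KNOWN_OPTIONS = ['service_tier', 'truncation'];

function buildPayload(array $base, array $providerOptions): array
{
    $knownKeys = array_flip(KNOWN_OPTIONS);

    // Known options keep their existing, explicitly mapped handling.
    $known = array_intersect_key($providerOptions, $knownKeys);

    // Unknown keys are forwarded to the request payload as-is.
    $passThrough = array_diff_key($providerOptions, $knownKeys);

    return array_merge($base, $known, $passThrough);
}

$payload = buildPayload(
    ['model' => 'gpt-5', 'input' => 'Hello'],
    [
        'service_tier'     => 'auto',
        'truncation'       => 'disabled',
        'prompt_cache_key' => 'my-cache-key',
    ]
);
// $payload now carries prompt_cache_key alongside the mapped options.
```

In a real implementation the known options would presumably still go through their typed mapping; the point is only that unrecognized keys survive into the final payload instead of being dropped.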
If useful, I can help by proposing a PR with tests for OpenAI first and then extending the same pattern to other providers.
Thanks for considering this!