
feat(pi): add azure subprovider support with base_url config#921

Merged

christso merged 12 commits into main from feat/pi-azure-subprovider on Apr 2, 2026
Conversation

@christso (Collaborator) commented Apr 2, 2026

Summary

  • Add `azure-openai-responses` as a supported subprovider for the `pi-coding-agent` and `pi-cli` providers via the short alias `subprovider: azure`
  • Add a `base_url` field to both pi provider configs, enabling full Azure OpenAI configuration via `targets.yaml` without requiring ambient env vars
  • Extract a shared `pi-provider-aliases` module to deduplicate the provider-to-env-var mapping (`ENV_KEY_MAP`, `ENV_BASE_URL_MAP`, `resolveSubprovider`)
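The alias resolution can be sketched roughly like this (a minimal illustration, not the actual `pi-provider-aliases` module; the real map contents may differ):

```typescript
// Hypothetical sketch of the shared alias helpers. Only the `azure`
// alias is confirmed by this PR; other entries are illustrative.
const SUBPROVIDER_ALIASES: Record<string, string> = {
  azure: "azure-openai-responses",
};

const ENV_KEY_MAP: Record<string, string> = {
  "azure-openai-responses": "AZURE_OPENAI_API_KEY",
  openai: "OPENAI_API_KEY",
};

// Expand a short alias to its full subprovider name, or pass it through.
function resolveSubprovider(name: string): string {
  return SUBPROVIDER_ALIASES[name] ?? name;
}
```

With this shape, `resolveSubprovider("azure")` yields `"azure-openai-responses"`, and unknown names are returned unchanged.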

Example target config

```yaml
- name: pi-azure
  provider: pi-coding-agent
  subprovider: azure
  base_url: ${{ AZURE_OPENAI_ENDPOINT }}
  model: ${{ AZURE_DEPLOYMENT_NAME }}
  api_key: ${{ AZURE_OPENAI_API_KEY }}
  grader_target: grader
  tools: read,bash,edit,write
```

Test plan

  • Unit tests for `resolveSubprovider` alias resolution (`azure` → `azure-openai-responses`)
  • Unit tests for the `ENV_KEY_MAP` and `ENV_BASE_URL_MAP` mappings
  • Integration tests for `pi-coding-agent` and `pi-cli` azure target resolution in `targets.test.ts`
  • Build, typecheck, and lint all pass
  • Manual: run an eval with `AGENT_TARGET=pi-azure` against an Azure OpenAI endpoint

🤖 Generated with Claude Code

christso and others added 3 commits April 3, 2026 00:11
Add azure-openai-responses as a supported subprovider for pi-coding-agent
and pi-cli providers. Users can now configure Azure OpenAI entirely via
targets.yaml using `subprovider: azure` (short alias) plus `base_url`,
`model`, and `api_key` fields.

- Extract shared pi-provider-aliases module (ENV_KEY_MAP, ENV_BASE_URL_MAP,
  resolveSubprovider) to deduplicate provider-to-env-var mapping
- Add baseUrl field to PiCodingAgentResolvedConfig and PiCliResolvedConfig
- Resolve base_url/baseUrl/endpoint from target YAML into the config
- Set AZURE_OPENAI_BASE_URL env var from config so the pi-ai SDK picks it up

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
… fix

- Add `azure-v1` alias for Azure /v1 endpoints that use the standard
  OpenAI client (resolves to `openai-responses`, avoids AzureOpenAI
  api-version conflicts)
- Construct fallback model descriptor when getModel returns undefined,
  enabling Azure deployment names not in the pi-ai registry
- Set model.baseUrl from config so the SDK uses the correct endpoint
- Call registerBuiltInApiProviders() to ensure streaming providers are
  available
- Fix pi-cli spawn on Windows with `shell: process.platform === 'win32'`
- Add `azure-v1` to PROVIDER_OWN_PREFIXES for correct env var stripping

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
The function exists at runtime in pi-ai but is not in the type definitions.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@cloudflare-workers-and-pages bot commented Apr 2, 2026

Deploying agentv with Cloudflare Pages

Latest commit: 7ea8628
Status: ⚡️ Build in progress...

christso and others added 7 commits April 3, 2026 07:45
- Resolve npm .cmd wrappers to underlying node scripts on Windows,
  avoiding shell escaping issues with prompt content containing
  special characters (<, *, -, etc.)
- Add resolveCliProvider() for pi-cli --provider flag mapping
  (azure-v1 → openai, azure → azure-openai-responses)
- Pi CLI uses different provider names than the SDK, so the CLI
  and SDK alias maps are separate

Verified e2e: pi-cli-azure target successfully connects to Azure
/v1 endpoint with gpt-5.4-mini, all 3 eval tests execute without
errors.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
- Import accessSync and readFileSync from node:fs
- Fix import formatting for biome

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Instead of requiring users to know about azure-v1 vs azure, `subprovider: azure`
now automatically uses the OpenAI-compatible client when `base_url` is
provided. This means Azure /v1 endpoints work out of the box with just:

  subprovider: azure
  base_url: ${{ AZURE_OPENAI_ENDPOINT }}

When `base_url` is absent, it falls back to the AzureOpenAI client with
AZURE_OPENAI_RESOURCE_NAME.

- Replace static alias maps with base_url-aware resolve functions
- Add resolveEnvKeyName/resolveEnvBaseUrlName helpers
- Update PROVIDER_OWN_PREFIXES to use resolved CLI provider name
- Update targets.yaml to use AZURE_OPENAI_ENDPOINT directly
- Comprehensive tests for all resolve functions with/without base_url

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
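The base_url-aware behavior described in that commit could look roughly like this (a hypothetical sketch; the real resolve functions live in `pi-provider-aliases` and may differ):

```typescript
// Sketch: pick the SDK provider for `subprovider: azure` depending on
// whether an explicit base_url was configured. Names are illustrative.
function resolveSdkProvider(subprovider: string, baseUrl?: string): string {
  if (subprovider === "azure") {
    // With an explicit base_url, use the OpenAI-compatible client so
    // Azure /v1 endpoints work without AzureOpenAI api-version handling.
    return baseUrl ? "openai-responses" : "azure-openai-responses";
  }
  return subprovider;
}
```

This keeps the user-facing config to just `subprovider: azure` plus an optional `base_url`, while the azure-v1 distinction stays internal.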
Pi-cli azure now uses the native azure-openai-responses provider with
AZURE_OPENAI_RESOURCE_NAME (extracted from base_url). This means
subprovider: azure works out of the box — no azure-v1 alias needed.

- extractAzureResourceName() parses resource name from URLs or passes
  through raw names
- Pi-cli buildEnv sets AZURE_OPENAI_API_KEY + AZURE_OPENAI_RESOURCE_NAME
- resolveCliProvider always maps azure → azure-openai-responses
- SDK path unchanged: azure + base_url → openai-responses (v1 compat)
- Add azure-smoke.eval.yaml for e2e testing

Verified e2e: pi CLI connects to Azure, model returns correct answers
(Paris, 2+2=4). Response parsing is a separate pre-existing issue.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
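The URL-or-raw-name parsing described above can be sketched as follows (an assumed implementation of `extractAzureResourceName`, not the PR's actual code):

```typescript
// Sketch: pull the Azure resource name out of an endpoint URL such as
// https://myresource.openai.azure.com/..., or pass a raw name through.
function extractAzureResourceName(input: string): string {
  try {
    const host = new URL(input).hostname; // e.g. "myresource.openai.azure.com"
    return host.split(".")[0];
  } catch {
    return input; // not a URL: treat the input as the raw resource name
  }
}
```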
…r azure

- resolveWindowsCmd now uses `where` to find the full path of the .cmd
  wrapper, fixing resolution when the executable isn't in CWD
- Skip --api-key flag for azure subprovider since the key is passed via
  AZURE_OPENAI_API_KEY env var (--api-key sets the wrong provider's key)

Note: pi-cli stdout capture on Windows is broken due to a Bun runtime
bug (Bun.spawn/child_process.spawn can't capture stdout from node
child processes on Windows). This affects all pi-cli evals on Windows,
not just azure. The azure connectivity itself is verified working — the
pi CLI returns correct responses (Paris, 4) but Bun can't read them.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
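The `where`-based lookup might look roughly like this (a hypothetical fragment; the real `resolveWindowsCmd` also unwraps the .cmd shim to its underlying node script):

```typescript
import { execFileSync } from "node:child_process";

// Sketch: find the full path of an npm .cmd wrapper on PATH using
// Windows' `where`, instead of relying on CWD-relative resolution.
function findCmdOnPath(name: string): string | undefined {
  if (process.platform !== "win32") return undefined;
  try {
    // `where` prints every PATH match, one per line; take the first.
    const out = execFileSync("where", [`${name}.cmd`], { encoding: "utf8" });
    return out.split(/\r?\n/)[0] || undefined;
  } catch {
    return undefined; // not found on PATH
  }
}
```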
Some providers (azure-openai-responses) may emit text content in
message_update events but leave the agent_end assistant message with
empty content. Fall back to the last message_end event with non-empty
content.

Also adds azure-smoke.eval.yaml for e2e connectivity testing.

Note: pi-cli azure evals return empty assistant content due to the
pi-coding-agent CLI's azure-openai-responses streaming not populating
the agent_end messages array. This is an upstream pi-ai issue.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
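The fallback described in that commit can be sketched like this (event shapes are assumed for illustration, not pi-ai's actual types):

```typescript
// Sketch: prefer the agent_end assistant content, but fall back to the
// last message_end event that carried non-empty text.
interface AgentEvent {
  type: "message_end" | "agent_end";
  content?: string;
}

function finalAssistantText(events: AgentEvent[]): string {
  const end = events.find((e) => e.type === "agent_end");
  if (end?.content) return end.content;
  // Scan backwards for the most recent non-empty message_end.
  for (let i = events.length - 1; i >= 0; i--) {
    const e = events[i];
    if (e.type === "message_end" && e.content) return e.content;
  }
  return "";
}
```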
christso and others added 2 commits April 3, 2026 10:29
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@christso christso merged commit 60c4962 into main Apr 2, 2026
3 of 4 checks passed
@christso christso deleted the feat/pi-azure-subprovider branch April 2, 2026 23:42