Remove hardcoded provider map (anthropic/openai) and instead query the running llmspy pod's providers.json for env var names and available providers. This means any provider llmspy supports (zai, deepseek, google, mistral, etc.) works with `obol model setup` without code changes.
Extract pure functions (parseProviderEnvKey, parseAvailableProviders, buildProviderStatus, patchLLMsJSON) from kubectl-calling wrappers so they can be tested without a running cluster. 23 test cases covering parsing, status cross-referencing, JSON patching, and edge cases.
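As a sketch of what one of these extracted pure functions might look like — the signature, error contract, and the pod script's output format are assumptions for illustration, not the PR's actual code:

```go
package main

import (
	"fmt"
	"strings"
)

// parseProviderEnvKey extracts an env var name from the pod script's
// stdout. Assumed contract: the script prints a single bare name like
// "ANTHROPIC_API_KEY", or nothing when the provider is unknown.
func parseProviderEnvKey(output, provider string) (string, error) {
	key := strings.TrimSpace(output)
	if key == "" {
		return "", fmt.Errorf("provider %q not found in providers.json", provider)
	}
	// Anything with interior whitespace is treated as a malformed
	// or error line rather than an env var name.
	if strings.ContainsAny(key, " \t") {
		return "", fmt.Errorf("unexpected output for provider %q: %q", provider, output)
	}
	return key, nil
}

func main() {
	key, err := parseProviderEnvKey("  ZHIPU_API_KEY\n", "zai")
	fmt.Println(key, err)
}
```

Because the function takes a string rather than running `kubectl` itself, the whitespace/empty/unknown cases can all be covered without a cluster.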
Adds TestIntegration_ZaiInference that exercises a provider NOT in the old hardcoded map, proving zero-code-change provider support. Uses glm-4-flash via llmspy routing with ZHIPU_API_KEY from .env.
Will merge this in after a 0.3.1 tag; don't want to add more problems to the critical path again today.
Summary
Remove the hardcoded `providerEnvKeys` map (anthropic/openai only) and instead query the running llmspy pod at runtime for provider metadata. This means any provider llmspy supports (zai, deepseek, google, mistral, groq, etc.) works with `obol model setup` without code changes.

Before
Adding a new provider required editing Go code, rebuilding, and releasing.
After
Provider info is discovered dynamically by executing a Python script inside the llmspy pod that reads `providers.json`, the same file llmspy itself uses. The CLI becomes a thin layer over llmspy's own provider registry.

What changed
`internal/model/model.go`
- Removed the `providerEnvKeys` hardcoded map
- Added `getProviderEnvKey()`, which queries the llmspy pod via `kubectl exec` for a single provider's env var name
- Added `GetAvailableProviders()`, which returns all providers that accept API keys, discovered from the running pod
- Updated `ConfigureLLMSpy()` to use dynamic env var lookup instead of the hardcoded map
- Updated `GetProviderStatus()` to cross-reference dynamic providers with ConfigMap and Secret state
- Extracted pure functions (`parseProviderEnvKey`, `parseAvailableProviders`, `buildProviderStatus`, `patchLLMsJSON`) from kubectl-calling wrappers for testability

`cmd/obol/model.go`
- `promptModelConfig()` now accepts `cfg` and queries llmspy for available providers dynamically
- `obol model status` table widened to accommodate longer provider names
- Empty state prints "Run 'obol model setup' to configure a provider."

`internal/model/model_test.go` (new)
- `TestParseProviderEnvKey`: 5 cases (valid output, whitespace, empty, unknown)
- `TestParseAvailableProviders`: 6 cases (empty, single, multiple, malformed lines)
- `TestBuildProviderStatus`: 6 cases (full status, ollama injection, missing env vars, invalid JSON)
- `TestPatchLLMsJSON`: 6 cases (enable, create new, idempotent, preserve fields, invalid JSON)

How it works
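The discovery flow described above might be sketched as follows; the in-pod script body, the `providers.json` path and schema, and the namespace/pod names are all assumptions for illustration, not the PR's actual code:

```go
package main

import (
	"fmt"
	"strings"
)

// A small Python script run inside the llmspy pod that reads
// providers.json and prints one provider name per line. The path and
// the "api_key_env" field name are guesses, not llmspy's real schema.
const listProvidersScript = `import json
with open("/app/providers.json") as f:
    for name, p in json.load(f).items():
        if p.get("api_key_env"):
            print(name)`

// kubectlExecArgs builds the argument list for `kubectl exec` without
// running anything, so the construction can be unit-tested cluster-free.
func kubectlExecArgs(namespace, pod, script string) []string {
	return []string{
		"exec", "-n", namespace, pod, "--",
		"python3", "-c", script,
	}
}

func main() {
	args := kubectlExecArgs("obol", "llmspy-0", listProvidersScript)
	fmt.Println(strings.Join(args[:5], " "))
}
```

In the real CLI, such an argument list would be handed to `exec.Command("kubectl", args...)` and the captured stdout fed to the pure parsing functions.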
Interactive mode (`obol model setup` with no flags) now shows all providers from llmspy.

Test plan
- `go build ./cmd/obol` compiles cleanly
- `go test ./...` passes (23 new tests + existing tests)
- `obol model setup --provider=zai --api-key=<key>` verified with a running cluster
- `obol model status` shows dynamically discovered providers
- `obol model setup` (interactive) lists all llmspy providers
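To illustrate why the pure-function extraction makes the test plan runnable without a cluster, here is a hedged sketch of a table-driven check against a hypothetical `parseAvailableProviders`; the pod's output format (one bare provider name per line) is an assumption:

```go
package main

import (
	"fmt"
	"reflect"
	"strings"
)

// parseAvailableProviders turns the pod script's stdout into a slice of
// provider names, skipping blank lines and anything with interior
// whitespace (assumed to be error/noise lines).
func parseAvailableProviders(output string) []string {
	var providers []string
	for _, line := range strings.Split(output, "\n") {
		name := strings.TrimSpace(line)
		if name == "" || strings.ContainsAny(name, " \t") {
			continue
		}
		providers = append(providers, name)
	}
	return providers
}

func main() {
	// Table-driven cases in the style of the new unit tests:
	// strings in, slices out, no kubectl involved.
	cases := []struct {
		in   string
		want []string
	}{
		{"", nil},
		{"anthropic\n", []string{"anthropic"}},
		{"anthropic\nzai\ndeepseek\n", []string{"anthropic", "zai", "deepseek"}},
		{"openai\nerror: something broke\n", []string{"openai"}},
	}
	for _, c := range cases {
		got := parseAvailableProviders(c.in)
		fmt.Println(reflect.DeepEqual(got, c.want), got)
	}
}
```

Each row exercises one of the empty/single/multiple/malformed shapes the PR's `TestParseAvailableProviders` cases describe.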
go build ./cmd/obolcompiles cleanlygo test ./...passes (23 new tests + existing tests)obol model setup --provider=zai --api-key=<key>with running clusterobol model statusshows dynamically discovered providersobol model setup(interactive) lists all llmspy providers