Commit e611f5b

Merge pull request #14 from madebygps/update-model-gpt-4.1-mini
Switch default model from gpt-5-mini to gpt-4.1-mini
2 parents: c4b93cb + 956e57d

Note: this is a large commit; only a subset of the 51 changed files is shown below.

51 files changed: +102 −102 lines

.env.sample

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ AZURE_OPENAI_CHAT_DEPLOYMENT=YOUR-AZURE-DEPLOYMENT-NAME
 OPENAI_API_KEY=YOUR-OPENAI-KEY
 OPENAI_MODEL=gpt-3.5-turbo
 # Configure for GitHub models: (GITHUB_TOKEN already exists inside Codespaces)
-GITHUB_MODEL=gpt-5-mini
+GITHUB_MODEL=gpt-4.1-mini
 GITHUB_TOKEN=YOUR-GITHUB-PERSONAL-ACCESS-TOKEN
 # Configure for Redis (used by agent_history_redis.py, defaults to dev container Redis):
 REDIS_URL=redis://localhost:6379

README.md

Lines changed: 2 additions & 2 deletions
@@ -117,13 +117,13 @@ If you want to run the scripts locally, you need to set up the `GITHUB_TOKEN` en
 export GITHUB_TOKEN=your_personal_access_token
 ```
 
-10. Optionally, you can use a model other than "gpt-5-mini" by setting the `GITHUB_MODEL` environment variable. Use a model that supports function calling, such as: `gpt-5`, `gpt-5-mini`, `gpt-4o`, `gpt-4o-mini`, `o3-mini`, `AI21-Jamba-1.5-Large`, `AI21-Jamba-1.5-Mini`, `Codestral-2501`, `Cohere-command-r`, `Ministral-3B`, `Mistral-Large-2411`, `Mistral-Nemo`, `Mistral-small`
+10. Optionally, you can use a model other than "gpt-4.1-mini" by setting the `GITHUB_MODEL` environment variable. Use a model that supports function calling, such as: `gpt-5`, `gpt-4.1-mini`, `gpt-4o`, `gpt-4o-mini`, `o3-mini`, `AI21-Jamba-1.5-Large`, `AI21-Jamba-1.5-Mini`, `Codestral-2501`, `Cohere-command-r`, `Ministral-3B`, `Mistral-Large-2411`, `Mistral-Nemo`, `Mistral-small`
 
 ## Using Azure AI Foundry models
 
 You can run all examples in this repository using GitHub Models. If you want to run the examples using models from Azure AI Foundry instead, you need to provision the Azure AI resources, which will incur costs.
 
-This project includes infrastructure as code (IaC) to provision Azure OpenAI deployments of "gpt-5-mini" and "text-embedding-3-large" via Azure AI Foundry. The IaC is defined in the `infra` directory and uses the Azure Developer CLI to provision the resources.
+This project includes infrastructure as code (IaC) to provision Azure OpenAI deployments of "gpt-4.1-mini" and "text-embedding-3-large" via Azure AI Foundry. The IaC is defined in the `infra` directory and uses the Azure Developer CLI to provision the resources.
 
 1. Make sure the [Azure Developer CLI (azd)](https://aka.ms/install-azd) is installed.
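A quick way to exercise step 10 above is to export the variable before launching a script; the model id here is illustrative, any function-calling model from the list works:

```shell
# Illustrative override: pick a different function-calling model for the examples.
export GITHUB_MODEL=gpt-4o-mini
echo "Using model: $GITHUB_MODEL"
```

The scripts read the variable at startup, so no code change is needed.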

examples/agent_basic.py

Lines changed: 2 additions & 2 deletions
@@ -24,11 +24,11 @@
     client = OpenAIChatClient(
         base_url="https://models.github.ai/inference",
         api_key=os.environ["GITHUB_TOKEN"],
-        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-5-mini"),
+        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini"),
     )
 else:
     client = OpenAIChatClient(
-        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-5-mini")
+        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")
     )
 
 agent = Agent(client=client, instructions="You're an informational agent. Answer questions cheerfully.")
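The change above is the standard environment-variable fallback, repeated across every example in this commit. A minimal sketch of just that pattern (no API calls; the override value is illustrative):

```python
import os

# Mirror the default-model fallback from the diff: an explicit
# GITHUB_MODEL setting wins, otherwise the new default is used.
os.environ.pop("GITHUB_MODEL", None)          # simulate the variable being unset
default_id = os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini")

os.environ["GITHUB_MODEL"] = "openai/gpt-4o"  # simulate a user override
override_id = os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini")

print(default_id, override_id)
```

Because the default lives in a single `os.getenv` call per file, the commit is a pure find-and-replace of that literal.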

examples/agent_evaluation.py

Lines changed: 4 additions & 4 deletions
Original file line numberDiff line numberDiff line change
@@ -48,22 +48,22 @@
4848
client = OpenAIChatClient(
4949
base_url="https://models.github.ai/inference",
5050
api_key=os.environ["GITHUB_TOKEN"],
51-
model_id=os.getenv("GITHUB_MODEL", "openai/gpt-5-mini"),
51+
model_id=os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini"),
5252
)
5353
eval_model_config = OpenAIModelConfiguration(
5454
type="openai",
5555
base_url="https://models.github.ai/inference",
5656
api_key=os.environ["GITHUB_TOKEN"],
57-
model="openai/gpt-5-mini",
57+
model="openai/gpt-4.1-mini",
5858
)
5959
else:
6060
client = OpenAIChatClient(
61-
api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-5-mini")
61+
api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")
6262
)
6363
eval_model_config = OpenAIModelConfiguration(
6464
type="openai",
6565
api_key=os.environ["OPENAI_API_KEY"],
66-
model=os.environ.get("OPENAI_MODEL", "gpt-5-mini"),
66+
model=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini"),
6767
)
6868

6969

examples/agent_evaluation_batch.py

Lines changed: 2 additions & 2 deletions
@@ -46,13 +46,13 @@
         type="openai",
         base_url="https://models.github.ai/inference",
         api_key=os.environ["GITHUB_TOKEN"],
-        model="openai/gpt-5-mini",
+        model="openai/gpt-4.1-mini",
     )
 else:
     model_config = OpenAIModelConfiguration(
         type="openai",
         api_key=os.environ["OPENAI_API_KEY"],
-        model=os.environ.get("OPENAI_MODEL", "gpt-5-mini"),
+        model=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini"),
     )
 
 # Optional: Set AZURE_AI_PROJECT in .env to log results to Azure AI Foundry.

examples/agent_evaluation_generate.py

Lines changed: 2 additions & 2 deletions
@@ -45,11 +45,11 @@
     client = OpenAIChatClient(
         base_url="https://models.github.ai/inference",
         api_key=os.environ["GITHUB_TOKEN"],
-        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-5-mini"),
+        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini"),
     )
 else:
     client = OpenAIChatClient(
-        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-5-mini")
+        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")
     )
examples/agent_history_redis.py

Lines changed: 2 additions & 2 deletions
@@ -38,11 +38,11 @@
     client = OpenAIChatClient(
         base_url="https://models.github.ai/inference",
         api_key=os.environ["GITHUB_TOKEN"],
-        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-5-mini"),
+        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini"),
     )
 else:
     client = OpenAIChatClient(
-        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-5-mini")
+        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")
     )
examples/agent_history_sqlite.py

Lines changed: 2 additions & 2 deletions
@@ -38,11 +38,11 @@
     client = OpenAIChatClient(
         base_url="https://models.github.ai/inference",
         api_key=os.environ["GITHUB_TOKEN"],
-        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-5-mini"),
+        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini"),
     )
 else:
     client = OpenAIChatClient(
-        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-5-mini")
+        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")
    )

examples/agent_knowledge_aisearch.py

Lines changed: 2 additions & 2 deletions
@@ -75,11 +75,11 @@
     client = OpenAIChatClient(
         base_url="https://models.github.ai/inference",
         api_key=os.environ["GITHUB_TOKEN"],
-        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-5-mini"),
+        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini"),
     )
 else:
     client = OpenAIChatClient(
-        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-5-mini")
+        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")
     )
 
 # ── Azure AI Search context provider ─────────────────────────────────

examples/agent_knowledge_pg.py

Lines changed: 2 additions & 2 deletions
@@ -78,7 +78,7 @@
     chat_client = OpenAIChatClient(
         base_url="https://models.github.ai/inference",
         api_key=os.environ["GITHUB_TOKEN"],
-        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-5-mini"),
+        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini"),
     )
     embed_client = OpenAI(
         base_url="https://models.github.ai/inference",
@@ -87,7 +87,7 @@
     embed_model = "text-embedding-3-small"
 else:
     chat_client = OpenAIChatClient(
-        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-5-mini")
+        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")
     )
     embed_client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
     embed_model = "text-embedding-3-small"
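Every example file touched by this commit repeats the same client-selection branch: use GitHub Models when `GITHUB_TOKEN` is set (as in Codespaces), otherwise fall back to OpenAI. A condensed sketch of just the model-id logic (no clients are constructed and no network calls are made; the token value is a dummy):

```python
import os

def pick_model() -> str:
    # GitHub Models path: a token is present (e.g. inside Codespaces).
    if os.environ.get("GITHUB_TOKEN"):
        return os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini")
    # OpenAI path: honor OPENAI_MODEL, else use the new default.
    return os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")

os.environ["GITHUB_TOKEN"] = "dummy-token"  # simulate the Codespaces case
os.environ.pop("GITHUB_MODEL", None)
print(pick_model())
```

Note the GitHub Models default carries the `openai/` publisher prefix while the plain OpenAI default does not, matching the two endpoints' naming conventions in the diffs above.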
