Commit 980943f: Merge pull request #6 from madebygps/feature/otel-aspire-example

2 parents: 391a52b + 1349c86
10 files changed: 482 additions & 8 deletions

.devcontainer/devcontainer.json

Lines changed: 4 additions & 5 deletions

```diff
@@ -1,9 +1,8 @@
 {
     "name": "python-agentframework-demos",
-    "build": {
-        "dockerfile": "Dockerfile",
-        "context": ".."
-    },
+    "dockerComposeFile": "docker-compose.yml",
+    "service": "app",
+    "workspaceFolder": "/workspaces/python-agentframework-demos",
     "features": {
         "ghcr.io/azure/azure-dev/azd:latest": {}
     },
@@ -22,4 +21,4 @@
         }
     },
     "remoteUser": "vscode"
-}
+}
```

.devcontainer/docker-compose.yml

Lines changed: 17 additions & 0 deletions

```yaml
services:
  app:
    build:
      context: ..
      dockerfile: .devcontainer/Dockerfile
    volumes:
      - ..:/workspaces/python-agentframework-demos:cached
    command: sleep infinity
    environment:
      - OTEL_EXPORTER_OTLP_ENDPOINT=http://aspire-dashboard:18889

  aspire-dashboard:
    image: mcr.microsoft.com/dotnet/aspire-dashboard:latest
    ports:
      - "18888:18888"
    environment:
      - DASHBOARD__FRONTEND__AUTHMODE=Unsecured
```
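In this compose file, only the dashboard UI port (18888) is published to the host; the app container reaches the OTLP gRPC ingestion port (18889) through Compose service-name DNS (`aspire-dashboard`). A minimal stdlib sketch of how an endpoint URL like this splits into host and port (the helper `parse_otlp_endpoint` is illustrative, not part of the repo):

```python
from urllib.parse import urlparse


def parse_otlp_endpoint(endpoint: str) -> tuple[str, int]:
    """Split an OTLP endpoint URL into (host, port); 4317 is the default OTLP gRPC port."""
    parsed = urlparse(endpoint)
    return parsed.hostname or "localhost", parsed.port or 4317


# Inside the dev container the endpoint uses the Compose service name as hostname:
print(parse_otlp_endpoint("http://aspire-dashboard:18889"))  # ('aspire-dashboard', 18889)
```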

.env.sample

Lines changed: 1 addition & 0 deletions

```diff
@@ -9,3 +9,4 @@ OPENAI_MODEL=gpt-3.5-turbo
 # Configure for GitHub models: (GITHUB_TOKEN already exists inside Codespaces)
 GITHUB_MODEL=gpt-5-mini
 GITHUB_TOKEN=YOUR-GITHUB-PERSONAL-ACCESS-TOKEN
+OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
```
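The new variable acts as an opt-in switch: the example script only configures OpenTelemetry exporters when it is set. A tiny sketch of that check (the function name is ours, for illustration):

```python
import os


def telemetry_enabled() -> bool:
    """Export is opt-in: enabled only when OTEL_EXPORTER_OTLP_ENDPOINT is set and non-empty."""
    return bool(os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT"))


os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "http://localhost:4317"
print(telemetry_enabled())  # True
```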

.github/prompts/review_pr_comments.prompt.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -133,4 +133,4 @@ The thread ID starts with `PRRT_` and can be found in the GraphQL query response
 Note: This skill can be removed once the GitHub MCP server has added built-in support for replying to PR review comments and resolving threads.
 See:
 https://github.com/github/github-mcp-server/issues/1323
-https://github.com/github/github-mcp-server/issues/1768
+https://github.com/github/github-mcp-server/issues/1768
```

README.md

Lines changed: 58 additions & 0 deletions

````diff
@@ -173,6 +173,64 @@ You can run the examples in this repository by executing the scripts in the `exa
 | [agent_mcp_local.py](examples/agent_mcp_local.py) | An agent connected to a local MCP server (e.g. for expense logging). |
 | [openai_tool_calling.py](examples/openai_tool_calling.py) | Tool calling with the low-level OpenAI SDK, showing manual tool dispatch. |
 | [workflow_basic.py](examples/workflow_basic.py) | A workflow-based agent. |
+| [agent_otel_aspire.py](examples/agent_otel_aspire.py) | An agent with OpenTelemetry tracing, metrics, and structured logs exported to the [Aspire Dashboard](https://aspire.dev/dashboard/standalone/). |
+
+## Using the Aspire Dashboard for telemetry
+
+The [agent_otel_aspire.py](examples/agent_otel_aspire.py) example can export OpenTelemetry traces, metrics, and structured logs to an [Aspire Dashboard](https://aspire.dev/dashboard/standalone/).
+
+### In GitHub Codespaces / Dev Containers
+
+The Aspire Dashboard runs automatically as a service alongside the dev container. No extra setup is needed.
+
+1. The `OTEL_EXPORTER_OTLP_ENDPOINT` environment variable is already set by the dev container.
+
+2. Run the example:
+
+   ```sh
+   uv run agent_otel_aspire.py
+   ```
+
+3. Open the dashboard at <http://localhost:18888> and explore:
+
+   * **Traces**: See the full span tree — agent invocation → chat completion → tool execution
+   * **Metrics**: View token usage and operation duration histograms
+   * **Structured Logs**: Browse conversation messages (system, user, assistant, tool)
+   * **GenAI visualizer**: Select a chat completion span to see the rendered conversation
+
+### Local environment (without Dev Containers)
+
+If you're running locally without Dev Containers, you need to start the Aspire Dashboard manually:
+
+1. Start the Aspire Dashboard:
+
+   ```sh
+   docker run --rm -it -d -p 18888:18888 -p 4317:18889 --name aspire-dashboard \
+     -e DASHBOARD__FRONTEND__AUTHMODE=Unsecured \
+     mcr.microsoft.com/dotnet/aspire-dashboard:latest
+   ```
+
+2. Add the OTLP endpoint to your `.env` file:
+
+   ```sh
+   OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
+   ```
+
+3. Run the example:
+
+   ```sh
+   uv run agent_otel_aspire.py
+   ```
+
+4. Open the dashboard at <http://localhost:18888> and explore.
+
+5. When done, stop the dashboard:
+
+   ```sh
+   docker stop aspire-dashboard
+   ```
+
+For the full Python + Aspire guide, see [Use the Aspire dashboard with Python apps](https://aspire.dev/dashboard/standalone-for-python/).
 
 ## Resources
````
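The two setups above use different endpoints: the Compose service address inside the dev container, and the localhost port published by `docker run -p 4317:18889` otherwise. A hedged stdlib sketch of that selection logic (the helper name is ours; `CODESPACES` and `REMOTE_CONTAINERS` are environment variables set by GitHub Codespaces and VS Code dev containers respectively, and are an assumption here, not something this commit checks):

```python
import os


def default_otlp_endpoint() -> str:
    """Mirror the docs above: an explicit setting wins, then the Compose service
    address inside Codespaces/Dev Containers, else the locally published port."""
    explicit = os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT")
    if explicit:
        return explicit
    if os.getenv("CODESPACES") or os.getenv("REMOTE_CONTAINERS"):
        return "http://aspire-dashboard:18889"
    return "http://localhost:4317"
```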

examples/agent_otel_aspire.py

Lines changed: 101 additions & 0 deletions

```python
import asyncio
import logging
import os
import random
from datetime import datetime, timezone
from typing import Annotated

from agent_framework import ChatAgent
from agent_framework.observability import configure_otel_providers
from agent_framework.openai import OpenAIChatClient
from azure.identity.aio import DefaultAzureCredential, get_bearer_token_provider
from dotenv import load_dotenv
from pydantic import Field
from rich import print
from rich.logging import RichHandler

# Setup logging
handler = RichHandler(show_path=False, rich_tracebacks=True, show_level=False)
logging.basicConfig(level=logging.WARNING, handlers=[handler], force=True, format="%(message)s")
logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)

# Load .env first so an OTEL_EXPORTER_OTLP_ENDPOINT set there is visible below
load_dotenv(override=True)

# Configure OpenTelemetry export to the Aspire Dashboard (if endpoint is set)
otlp_endpoint = os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT")
if otlp_endpoint:
    os.environ.setdefault("OTEL_EXPORTER_OTLP_PROTOCOL", "grpc")
    os.environ.setdefault("OTEL_SERVICE_NAME", "agent-framework-demo")
    configure_otel_providers(enable_sensitive_data=True)
    logger.info(f"OpenTelemetry export enabled — sending to {otlp_endpoint}")
else:
    logger.info(
        "Set OTEL_EXPORTER_OTLP_ENDPOINT in .env to export telemetry to the Aspire Dashboard. "
        "Use http://aspire-dashboard:18889 in Codespaces/Dev Containers or http://localhost:4317 locally."
    )

# Configure OpenAI client based on environment
API_HOST = os.getenv("API_HOST", "github")

async_credential = None
if API_HOST == "azure":
    async_credential = DefaultAzureCredential()
    token_provider = get_bearer_token_provider(async_credential, "https://cognitiveservices.azure.com/.default")
    client = OpenAIChatClient(
        base_url=f"{os.environ['AZURE_OPENAI_ENDPOINT']}/openai/v1/",
        api_key=token_provider,
        model_id=os.environ["AZURE_OPENAI_CHAT_DEPLOYMENT"],
    )
elif API_HOST == "github":
    client = OpenAIChatClient(
        base_url="https://models.github.ai/inference",
        api_key=os.environ["GITHUB_TOKEN"],
        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-5-mini"),
    )
else:
    client = OpenAIChatClient(
        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-5-mini")
    )


def get_weather(
    city: Annotated[str, Field(description="City name, spelled out fully")],
) -> dict:
    """Returns weather data for a given city, a dictionary with temperature and description."""
    logger.info(f"Getting weather for {city}")
    weather_options = [
        {"temperature": 72, "description": "Sunny"},
        {"temperature": 60, "description": "Rainy"},
        {"temperature": 55, "description": "Cloudy"},
        {"temperature": 45, "description": "Windy"},
    ]
    return random.choice(weather_options)


def get_current_time(
    timezone_name: Annotated[str, Field(description="Timezone name, e.g. 'US/Eastern', 'Asia/Tokyo', 'UTC'")],
) -> str:
    """Returns the current date and time in UTC (timezone_name is for display context only)."""
    logger.info(f"Getting current time for {timezone_name}")
    now = datetime.now(timezone.utc)
    return f"The current time in {timezone_name} is approximately {now.strftime('%Y-%m-%d %H:%M:%S')} UTC"


agent = ChatAgent(
    name="weather-time-agent",
    chat_client=client,
    instructions="You are a helpful assistant that can look up weather and time information.",
    tools=[get_weather, get_current_time],
)


async def main():
    response = await agent.run("What's the weather in Seattle and what time is it in Tokyo?")
    print(response.text)

    if async_credential:
        await async_credential.close()


if __name__ == "__main__":
    asyncio.run(main())
```

examples/spanish/README.md

Lines changed: 58 additions & 0 deletions

````diff
@@ -174,6 +174,64 @@ You can run the examples in this repository by running the scripts in the di
 | [agent_mcp_local.py](agent_mcp_local.py) | An agent connected to a local MCP server (e.g. for expense logging). |
 | [openai_tool_calling.py](openai_tool_calling.py) | Tool calling with the low-level OpenAI SDK, showing manual tool dispatch. |
 | [workflow_basic.py](workflow_basic.py) | Uses Agent Framework to create a workflow-based agent. |
+| [agent_otel_aspire.py](agent_otel_aspire.py) | An agent with OpenTelemetry traces, metrics, and structured logs exported to the [Aspire Dashboard](https://aspire.dev/dashboard/standalone/). |
+
+## Using the Aspire Dashboard for telemetry
+
+The [agent_otel_aspire.py](agent_otel_aspire.py) example can export OpenTelemetry traces, metrics, and structured logs to an [Aspire Dashboard](https://aspire.dev/dashboard/standalone/).
+
+### In GitHub Codespaces / Dev Containers
+
+The Aspire Dashboard runs automatically as a service alongside the dev container. No extra setup is needed.
+
+1. The `OTEL_EXPORTER_OTLP_ENDPOINT` environment variable is already set by the dev container.
+
+2. Run the example:
+
+   ```sh
+   uv run agent_otel_aspire.py
+   ```
+
+3. Open the dashboard at <http://localhost:18888> and explore:
+
+   * **Traces**: See the full span tree — agent invocation → chat completion → tool execution
+   * **Metrics**: View token usage and operation duration histograms
+   * **Structured Logs**: Browse the conversation messages (system, user, assistant, tool)
+   * **GenAI visualizer**: Select a chat completion span to see the rendered conversation
+
+### Local environment (without Dev Containers)
+
+If you're running locally without Dev Containers, you need to start the Aspire Dashboard manually:
+
+1. Start the Aspire Dashboard:
+
+   ```sh
+   docker run --rm -it -d -p 18888:18888 -p 4317:18889 --name aspire-dashboard \
+     -e DASHBOARD__FRONTEND__AUTHMODE=Unsecured \
+     mcr.microsoft.com/dotnet/aspire-dashboard:latest
+   ```
+
+2. Add the OTLP endpoint to your `.env` file:
+
+   ```sh
+   OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
+   ```
+
+3. Run the example:
+
+   ```sh
+   uv run agent_otel_aspire.py
+   ```
+
+4. Open the dashboard at <http://localhost:18888> and explore.
+
+5. When done, stop the dashboard:
+
+   ```sh
+   docker stop aspire-dashboard
+   ```
+
+For the full Python + Aspire guide, see [Use the Aspire dashboard with Python apps](https://aspire.dev/dashboard/standalone-for-python/).
 
 ## Resources
````

examples/spanish/agent_otel_aspire.py

Lines changed: 103 additions & 0 deletions

```python
import asyncio
import logging
import os
import random
from datetime import datetime, timezone
from typing import Annotated

from agent_framework import ChatAgent
from agent_framework.observability import configure_otel_providers
from agent_framework.openai import OpenAIChatClient
from azure.identity.aio import DefaultAzureCredential, get_bearer_token_provider
from dotenv import load_dotenv
from pydantic import Field
from rich import print
from rich.logging import RichHandler

# Set up logging
handler = RichHandler(show_path=False, rich_tracebacks=True, show_level=False)
logging.basicConfig(level=logging.WARNING, handlers=[handler], force=True, format="%(message)s")
logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)

# Load .env first so an OTEL_EXPORTER_OTLP_ENDPOINT set there is visible below
load_dotenv(override=True)

# Configure OpenTelemetry export to the Aspire Dashboard (if the endpoint is set)
otlp_endpoint = os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT")
if otlp_endpoint:
    os.environ.setdefault("OTEL_EXPORTER_OTLP_PROTOCOL", "grpc")
    os.environ.setdefault("OTEL_SERVICE_NAME", "agent-framework-demo")
    configure_otel_providers(enable_sensitive_data=True)
    logger.info(f"OpenTelemetry export enabled — sending to {otlp_endpoint}")
else:
    logger.info(
        "Set OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317 in .env"
        " to export telemetry to the Aspire Dashboard"
    )

# Configure the client to use Azure OpenAI, GitHub Models, or OpenAI
API_HOST = os.getenv("API_HOST", "github")

async_credential = None
if API_HOST == "azure":
    async_credential = DefaultAzureCredential()
    token_provider = get_bearer_token_provider(async_credential, "https://cognitiveservices.azure.com/.default")
    client = OpenAIChatClient(
        base_url=f"{os.environ['AZURE_OPENAI_ENDPOINT']}/openai/v1/",
        api_key=token_provider,
        model_id=os.environ["AZURE_OPENAI_CHAT_DEPLOYMENT"],
    )
elif API_HOST == "github":
    client = OpenAIChatClient(
        base_url="https://models.github.ai/inference",
        api_key=os.environ["GITHUB_TOKEN"],
        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-5-mini"),
    )
else:
    client = OpenAIChatClient(
        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-5-mini")
    )


def get_weather(
    city: Annotated[str, Field(description="City name, spelled out fully")],
) -> dict:
    """Returns weather data for a given city: temperature and description."""
    logger.info(f"Getting weather for {city}")
    weather_options = [
        {"temperature": 22, "description": "Sunny"},
        {"temperature": 15, "description": "Rainy"},
        {"temperature": 13, "description": "Cloudy"},
        {"temperature": 7, "description": "Windy"},
    ]
    return random.choice(weather_options)


def get_current_time(
    timezone_name: Annotated[
        str, Field(description="Timezone name, e.g. 'US/Eastern', 'America/Mexico_City', 'UTC'")
    ],
) -> str:
    """Returns the current date and time in UTC (timezone_name is for display context only)."""
    logger.info(f"Getting current time for {timezone_name}")
    now = datetime.now(timezone.utc)
    return f"The current time in {timezone_name} is approximately {now.strftime('%Y-%m-%d %H:%M:%S')} UTC"


agent = ChatAgent(
    name="weather-time-agent",
    chat_client=client,
    instructions="You are a helpful assistant that can look up weather and time information.",
    tools=[get_weather, get_current_time],
)


async def main():
    response = await agent.run("What's the weather in Mexico City and what time is it in Buenos Aires?")
    print(response.text)

    if async_credential:
        await async_credential.close()


if __name__ == "__main__":
    asyncio.run(main())
```

pyproject.toml

Lines changed: 1 addition & 0 deletions

```diff
@@ -13,6 +13,7 @@ dependencies = [
     "aiohttp",
     "faker",
     "fastmcp",
+    "opentelemetry-exporter-otlp-proto-grpc",
     "agent-framework-core @ git+https://github.com/microsoft/agent-framework.git@98cd72839e4057d661a58092a3b013993264d834#subdirectory=python/packages/core",
     "agent-framework-devui @ git+https://github.com/microsoft/agent-framework.git@98cd72839e4057d661a58092a3b013993264d834#subdirectory=python/packages/devui",
 ]
```
