24 commits
344ed52
feat(web): add posthog-backed adoption signals and demote PyPI to sec…
abrichr Mar 5, 2026
7de9012
Improve usage metric auto-mapping and add one-line install UX
abrichr Mar 5, 2026
984544d
Auto-resolve PostHog project ID when API key is present
abrichr Mar 5, 2026
30d4c5c
Handle phc project tokens with explicit metrics auth guidance
abrichr Mar 5, 2026
c282bb1
Default PostHog project ID and document env setup
abrichr Mar 5, 2026
9e3a7b6
chore: trigger netlify preview after env update
abrichr Mar 5, 2026
21b992e
Treat reachable PostHog source as available even at zero volume
abrichr Mar 5, 2026
d5b705a
fix(metrics): avoid false zeros when PostHog volume fields are unavai…
abrichr Mar 5, 2026
0a71bfa
perf(ui): add cached adoption metrics UX and faster API fetch path
abrichr Mar 5, 2026
4568183
feat(metrics): improve loading UX and use ecosystem GitHub totals
abrichr Mar 5, 2026
6ef4346
fix(ui): ignore stale unconfigured adoption-metrics cache
abrichr Mar 5, 2026
f630581
fix(copy+layout): remove secondary label and align adoption panel width
abrichr Mar 5, 2026
6dc3823
feat(adoption): share all-time/90-day window across adoption and PyPI
abrichr Mar 6, 2026
868c704
refine(adoption): reduce duplicate secondary metrics and add telemetr…
abrichr Mar 6, 2026
07de4a2
fix(adoption): compute coverage start from canonical telemetry events
abrichr Mar 6, 2026
a85e716
fix: handle transient posthog failures without false not-configured s…
abrichr Mar 6, 2026
4788109
feat: emphasize telemetry coverage window in adoption cards
abrichr Mar 6, 2026
f504147
chore: move adoption signals below pypi chart
abrichr Mar 6, 2026
a291d5e
fix: stabilize adoption metrics cache and clarify telemetry card labels
abrichr Mar 6, 2026
c2a5fd4
fix: align coverage date with strict telemetry event counters
abrichr Mar 6, 2026
4759a28
refactor: focus adoption panel on product telemetry metrics
abrichr Mar 6, 2026
aa33ba6
tweak: consolidate early-data notice and keep telemetry cards on one row
abrichr Mar 6, 2026
83c18c0
feat(metrics): add total-events card and transparent telemetry breakd…
abrichr Mar 7, 2026
4397d77
Add telemetry transparency panel and metadata contract
abrichr Mar 7, 2026
7 changes: 7 additions & 0 deletions .env.example
@@ -0,0 +1,7 @@
# Server-side PostHog credentials for /api/project-metrics
POSTHOG_HOST=https://us.i.posthog.com
POSTHOG_PROJECT_ID=68185

# Required for reading event definitions and 30d volumes.
# NOTE: this must be a PERSONAL API key (phx_), not the project token (phc_).
# POSTHOG_PERSONAL_API_KEY=phx_your_personal_api_key
105 changes: 105 additions & 0 deletions METRICS_SIGNAL_GUIDE.md
@@ -0,0 +1,105 @@
# Homepage Metrics Signal Guide

## Goal
Show credibility metrics that reflect real usage, not only package installs.

## Metric hierarchy
1. **Primary**: GitHub traction + product usage metrics
2. **Secondary**: PyPI download trends

PyPI remains useful, but should not be the lead signal by itself.

## Implemented API
`GET /api/project-metrics`

Returns:
- `github`: stars/forks/watchers/issues
- `usage`: total events + demos/runs/actions (30d/90d/all-time when available), source metadata, caveats
- `warnings`: non-fatal fetch warnings
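A minimal sketch of the response shape, for orientation. Only the top-level keys (`github`, `usage`, `warnings`) and the listed metrics come from this guide; the nested field names below are illustrative, not the actual contract:

```typescript
// Hypothetical TypeScript shape for the /api/project-metrics response.
// Field names inside github/usage are illustrative placeholders.
interface ProjectMetrics {
  github: {
    stars: number;
    forks: number;
    watchers: number;
    openIssues: number;
  };
  usage: {
    totalEvents30d?: number;
    totalEvents90d?: number;
    totalEventsAllTime?: number;
    demosRecorded30d?: number;
    agentRuns30d?: number;
    guiActions30d?: number;
    source: "posthog" | "env_override" | "unavailable"; // source metadata
    caveats: string[];
  };
  warnings: string[]; // non-fatal fetch warnings
}

const example: ProjectMetrics = {
  github: { stars: 1200, forks: 150, watchers: 90, openIssues: 40 },
  usage: { totalEvents30d: 5000, source: "posthog", caveats: [] },
  warnings: [],
};
```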

## Usage metric sources
### Source A: PostHog (preferred)
If server env vars are set:
- `POSTHOG_PERSONAL_API_KEY` (or `POSTHOG_API_KEY`)
- optional `POSTHOG_PROJECT_ID` (defaults to `68185`, auto-resolved from API key if omitted)
- optional `POSTHOG_HOST` (defaults to `https://us.posthog.com`)

The API reads PostHog `event_definitions` and sums `volume_30_day` for matched events.
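A sketch of that aggregation step, under the assumption that each PostHog event definition exposes a `volume_30_day` field that may be null. The null handling mirrors the "avoid false zeros when volume fields are unavailable" fix in this PR; the helper name is illustrative:

```typescript
// Sum volume_30_day across matched event definitions, distinguishing a
// true zero from "PostHog returned no volume data at all".
interface EventDefinition {
  name: string;
  volume_30_day: number | null; // null when PostHog has no volume data
}

function sumCategoryVolume(
  definitions: EventDefinition[],
  matchedNames: Set<string>
): number | null {
  let sawVolume = false;
  let total = 0;
  for (const def of definitions) {
    if (!matchedNames.has(def.name)) continue;
    if (def.volume_30_day === null) continue; // skip, don't count as zero
    sawVolume = true;
    total += def.volume_30_day;
  }
  // null signals "unavailable" to the caller rather than a false zero.
  return sawVolume ? total : null;
}
```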

### Source B: explicit overrides (fallback/manual)
Use:
- `OPENADAPT_METRIC_DEMOS_RECORDED_30D`
- `OPENADAPT_METRIC_AGENT_RUNS_30D`
- `OPENADAPT_METRIC_GUI_ACTIONS_30D`
- `OPENADAPT_METRIC_TOTAL_EVENTS_30D`
- `OPENADAPT_METRIC_TOTAL_EVENTS_90D`
- `OPENADAPT_METRIC_TOTAL_EVENTS_ALL_TIME`
- `OPENADAPT_METRIC_APPS_AUTOMATED`
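Reading these overrides can be as simple as the sketch below. The env var names are the ones listed above; the parsing helper and its validation rules are illustrative:

```typescript
// Parse an explicit metric override; returns undefined when unset or invalid
// so the caller can fall through to PostHog (Source A).
function readMetricOverride(
  env: Record<string, string | undefined>,
  name: string
): number | undefined {
  const raw = env[name];
  if (raw === undefined || raw.trim() === "") return undefined;
  const value = Number(raw);
  return Number.isFinite(value) && value >= 0 ? value : undefined;
}
```

In the API route this would be called as, e.g., `readMetricOverride(process.env, "OPENADAPT_METRIC_AGENT_RUNS_30D")`.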

## UI behavior
- The telemetry panel always shows **Total Events** first.
- Detailed breakdown cards (Demos / Agent Runs / GUI Actions) are shown when telemetry has enough depth.
- Current unlock gate:
- at least `100` total events in last 90 days, and
- at least `14` days of telemetry coverage (if coverage date is available).
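The unlock gate above can be expressed as a single predicate. The thresholds (100 events, 14 days) are from this guide; the function name and signature are illustrative:

```typescript
// Decide whether to show the detailed breakdown cards.
// coverageDays is null when no coverage date is available, in which case
// only the event-volume threshold applies.
function shouldShowBreakdown(
  totalEvents90d: number,
  coverageDays: number | null
): boolean {
  if (totalEvents90d < 100) return false;
  if (coverageDays !== null && coverageDays < 14) return false;
  return true;
}
```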

## Current event-name mapping (exact-first)
### Demos
- `recording_finished`
- `recording_completed`
- `demo_recorded`
- `demo_completed`
- `recording.stopped`
- `recording.saved`
- `record.stopped`
- `capture.completed`
- `capture.saved`

### Runs
- `automation_run`
- `agent_run`
- `benchmark_run`
- `replay_started`
- `episode_started`
- `replay.started`

### Actions
- `action_executed`
- `step_executed`
- `mouse_click`
- `keyboard_input`
- `ui_action`
- `action_triggered`

Ignored low-signal events:
- `function_trace`
- `get_events.started`
- `get_events.completed`
- `visualize.started`
- `visualize.completed`

If exact mappings return zero for a category, the API automatically falls back to guarded pattern matching. This keeps counters populated for new event families (for example `command:*` / `operation:*`) without relying on env overrides.
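The exact-first-then-fallback logic can be sketched as below, using the Runs category as the example. The exact names are a subset of the list above; the fallback patterns (`command:*` / `operation:*`) follow the event families mentioned in this guide, but the exact regexes are illustrative:

```typescript
// Exact-name matching first; guarded prefix-pattern fallback only when the
// exact list matches nothing, so exact mappings always win when present.
const EXACT_RUN_EVENTS = new Set([
  "automation_run",
  "agent_run",
  "replay_started",
]);
const RUN_FALLBACK_PATTERNS = [/^command:/, /^operation:/];

function matchRunEvents(eventNames: string[]): string[] {
  const exact = eventNames.filter((n) => EXACT_RUN_EVENTS.has(n));
  if (exact.length > 0) return exact;
  // Fallback: only reached when no exact names were seen at all.
  return eventNames.filter((n) =>
    RUN_FALLBACK_PATTERNS.some((p) => p.test(n))
  );
}
```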

## Tradeoffs
### PostHog event_definitions approach (exact-first + fallback)
Pros:
- no client auth exposure
- lightweight implementation
- easy to keep server-cached
- uses real event names already emitted by OpenAdapt repos

Cons:
- still depends on naming consistency for best precision
- fallback regex can overcount if external events use similar names

### Env override approach
Pros:
- explicit and deterministic
- useful before PostHog is fully instrumented

Cons:
- manual updates needed
- risk of stale numbers without process discipline

## Recommended next step
Standardize new instrumentation on one of the existing exact-name event families, so counters stay populated without relying on fallback pattern matching.
113 changes: 113 additions & 0 deletions TELEMETRY_TRANSPARENCY_DESIGN.md
@@ -0,0 +1,113 @@
# Telemetry Transparency Panel Design

Date: 2026-03-06
Scope: `openadapt-web` adoption telemetry section (`/` homepage)

## Goal
Make telemetry metrics auditable by users without adding modal friction or inaccurate copy.

Users should be able to answer:
1. What exactly do these counters represent?
2. Which event names are included/excluded?
3. What data fields are collected in telemetry events?
4. How to opt out?

## Constraints
- Must be safe for preview branches/Netlify deploy previews.
- Must avoid "copy drift" where docs/UI differ from code.
- Must keep the homepage readable on mobile.
- Must not expose secrets or raw event payloads.

## Options Considered

### Option A: Modal dialog ("Learn what we collect")
Pros:
- High visual focus
- Plenty of space for detail

Cons:
- Extra click + context switch
- Worse mobile ergonomics
- More complexity (focus traps, keyboard handling)

### Option B: Inline expandable panel (`details/summary`) under telemetry cards
Pros:
- Low friction; transparency lives directly where the metrics are shown
- Native accessibility semantics
- Easy to keep synchronized with API payload

Cons:
- Adds vertical length to the section
- Slightly less "spotlight" than modal

### Option C: Separate telemetry transparency page
Pros:
- Max room for detail and diagrams
- Better long-form documentation

Cons:
- Easy to ignore
- Higher drift risk if disconnected from live classification constants

## Decision
Use **Option B** (inline expandable panel) and generate panel content from `/api/project-metrics` response metadata.

Rationale:
- Lowest UX friction
- Strongest anti-drift path: the panel renders metadata the API derives from the same server constants used for classification
- Works well in preview and production

## Proposed Implementation

### 1) API metadata contract
Add `usage.transparency` object to `/api/project-metrics` response:
- `classification_version`
- `dashboard_scope`:
- this UI uses aggregate counts (not raw event payload display)
- event names + timestamps + counts for metrics
- `included_event_names` (demos/runs/actions exact lists)
- `ignored_event_names`
- `ignored_event_patterns`
- `fallback_patterns` (for each category)
- `collection_fields`:
- common event envelope fields (e.g., category, timestamp)
- common tags (internal, package, package_version, os, python_version, ci)
- common operation fields (command/operation/success/duration/item_count)
- `privacy_controls`:
- explicit never-collect list
- scrubbing behaviors
- opt-out env vars (`DO_NOT_TRACK`, `OPENADAPT_TELEMETRY_ENABLED`)
- `source_links`:
- privacy policy
- telemetry source docs

### 2) UI panel
In `AdoptionSignals`:
- Add collapsible section titled `Telemetry details (what we collect)`.
- Render sections:
- Metric counting scope
- Included event names
- Excluded event names/patterns
- Collected fields/tags
- Privacy controls + opt-out
- Source links
- Keep panel below metric cards and status messages.
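Because Option B uses native `details/summary`, the panel skeleton is small. The sketch below renders it as a plain HTML string for illustration; the real implementation would be a component in `AdoptionSignals`:

```typescript
// Illustrative skeleton of the inline expandable panel (Option B).
// sectionsHtml stands in for the rendered metadata sections.
function renderTelemetryPanel(sectionsHtml: string): string {
  return [
    "<details>",
    "  <summary>Telemetry details (what we collect)</summary>",
    sectionsHtml,
    "</details>",
  ].join("\n");
}
```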

### 3) Safety / trust language
- Avoid claiming raw payload access on the web page.
- Clearly separate:
- "Data used for these counters"
- "Telemetry client fields in OpenAdapt instrumentation"

## Tradeoffs
- More UI density, but transparent and auditable.
- Some duplication with privacy policy, but scoped to metrics and backed by API constants.
- Requires periodic updates if telemetry schema changes; mitigated by API-backed metadata contract.

## Preview-Branch Plan
Roll out in PR preview first. Validate:
1. Readability on desktop + mobile
2. No contradiction with `/privacy-policy`
3. Metadata stays aligned with event classification constants

Then decide whether to merge to main.