Assistant Default Optimization Playbook: Win "Choose Your AI" Moments After Anthropic's "Keep Thinking" Launch
A step-by-step ADO and AEO plan to capture default status across browsers, OEM setup flows, and share sheets using verifiable trust proofs, signed capability feeds, and performance-backed co-marketing bundles.

Vicky
Sep 21, 2025
Why now
Consumer attention is shifting to assistant pickers and Ask AI entry points. Anthropic's first broad consumer push, the "Keep Thinking" launch on September 18, 2025, put assistants back in the spotlight. At the same time, browsers are adding AI buttons, OEMs are prompting assistant choice during setup, and platform share sheets are becoming gateways that route tasks. These are the new default battlegrounds.
Define the discipline
Assistant Default Optimization (ADO) is the systematic acquisition and retention of default status across "choose your AI" surfaces. It complements Assistant Engine Optimization (AEO), which ensures your assistant is eligible, indexed, and preferred when a surface routes queries across models.
Goal hierarchy
- Secure the default
- Sustain stickiness
- Win reroutes
- Grow cross-surface share of default
What to ship first
1) Machine verifiable trust proofs
Make trust machine-readable so browsers, OEMs, and app platforms can auto-validate your safety posture and eligibility.
- Safety policy VC: Publish your safety policy and red team scope as a W3C Verifiable Credential signed by your org and verifiable by partners. See the standard in W3C Verifiable Credentials 2.0.
- Eval badges: Attach independently reproducible scores from MLCommons AILuminate and HELM Capabilities. Issue them as verifiable credentials or Open Badges for easy display and API checks.
- Cryptographic attestations: Sign policy and eval artifacts with Sigstore and publish transparency proofs so partners can verify origin, timestamp, and integrity automatically.
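Before wiring up the full Sigstore flow, it helps to see the shape of an attestation. The sketch below is not the Sigstore DSSE or Rekor workflow itself; it is a minimal stand-in that uses a locally generated Ed25519 key to show how a policy artifact can be canonicalized, hashed, signed, and re-verified. The policy fields are illustrative.

import hashlib
import json

from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Illustrative policy artifact; a real publisher would use the safety.vc.jsonld content.
policy = {
    "policy_version": "2025-08-28",
    "prohibited_content": ["csam", "weapons", "self-harm"],
    "appeal_flow": "https://example.com/appeals",
}

# Canonicalize deterministically so the digest is reproducible by partners.
payload = json.dumps(policy, sort_keys=True, separators=(",", ":")).encode()

key = Ed25519PrivateKey.generate()
attestation = {
    "artifact_sha256": hashlib.sha256(payload).hexdigest(),
    "signature": key.sign(payload).hex(),
    "public_key": key.public_key()
    .public_bytes(serialization.Encoding.Raw, serialization.PublicFormat.Raw)
    .hex(),
}

# Partner-side check: verify() raises InvalidSignature if anything was tampered with.
key.public_key().verify(bytes.fromhex(attestation["signature"]), payload)
print("policy attestation verifies:", attestation["artifact_sha256"][:12])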
Minimum viable artifact set
- safety.vc.jsonld — W3C VC describing policy scope, prohibited content, appeal flow
- evals.vc.jsonld — VC with benchmark identifiers and scores
- attestations/ — DSSE bundles and transparency proofs
- ai-discovery — ai.txt and robots.json equivalents to advertise feeds and eligibility
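There is no settled standard for ai.txt or robots.json equivalents yet, so treat the layout below as one illustrative shape: a small manifest at a stable URL that points partners at the signed feed, the VCs, the attestation bundle, and the signing keys. Every path and field name here is an assumption, not a published spec.

import json

# Hypothetical discovery manifest served from a well-known location.
ai_discovery = {
    "version": "0.1",
    "capabilities_feed": "https://example.com/.well-known/capabilities.json",
    "safety_vc": "https://example.com/.well-known/safety.vc.jsonld",
    "evals_vc": "https://example.com/.well-known/evals.vc.jsonld",
    "attestations": "https://example.com/.well-known/attestations/",
    "signing_keys": "https://example.com/.well-known/ai-keys.json",  # rotate and version these
    "contact": "mailto:partners@example.com",
}

print(json.dumps(ai_discovery, indent=2))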
Example badge payloads
- AILuminate result summary as VC
- HELM Capabilities subset with scenario-level scores and model version
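As a concrete sketch, here is roughly what an AILuminate badge could look like when wrapped as a W3C VC 2.0 credential. The @context, type, issuer, validFrom, and credentialSubject keys follow the VC 2.0 data model; the credentialSubject field names and the example scores are illustrative, not an MLCommons or W3C schema.

# Illustrative eval badge expressed with the W3C VC 2.0 core terms.
eval_badge_vc = {
    "@context": ["https://www.w3.org/ns/credentials/v2"],
    "type": ["VerifiableCredential", "AIEvalBadgeCredential"],
    "issuer": "did:web:example.com",
    "validFrom": "2025-09-01T00:00:00Z",
    "credentialSubject": {
        "model": {"name": "Claude 3.7 Sonnet", "version": "2025-02-19"},
        "suite": "MLCommons AILuminate v1.0",
        "suite_id": "ailuminate:1.0",
        "score": {"overall": "Very Good"},
        "run_date": "2025-08-30",
    },
    # In practice the credential is completed with a proof (for example a Data
    # Integrity proof or a signed JWT) plus the Sigstore attestation described above.
}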
2) Structured capability feeds
Publish a signed JSON feed that assistant selectors can consume at setup, in share sheets, or inside an Ask AI button. Include eligibility flags, guardrail modes, and pricing so partners can route with confidence.
Suggested schema v0.3
{
  "model": {"name": "Claude 3.7 Sonnet", "version": "2025-02-19", "modalities": ["text","image"], "license": "commercial"},
  "capabilities": {"reasoning_modes": ["realtime","deliberate"], "tools": ["code","browse"], "context_tabs_supported": 10},
  "performance": {"p50_latency_ms": 850, "p95_latency_ms": 2100, "throughput_rps": 15},
  "limits": {"max_input_tokens": 150000, "max_output_tokens": 8000},
  "safety": {"policy_ref": "/policy/safety.json", "age_modes": ["13+","18+"], "red_team_scope": ["self-harm","weapons","csam"]},
  "evals": [
    {"suite": "MLCommons AILuminate v1.0", "id": "ailuminate:1.0", "score": {"overall": "Very Good"}},
    {"suite": "HELM Capabilities v1.0", "id": "helm-capabilities:v1.0", "score": {"aggregate": 0.674}}
  ],
  "privacy": {"training_opt_in": true, "retention_days": 30, "consumer_terms_version": "2025-08-28"},
  "pricing": {"usd_per_million_tokens_in": 3.00, "usd_per_million_tokens_out": 15.00},
  "commerce": {"oem_revshare": 0.15, "trial_days": 90},
  "eligibility": {"browsers": ["Chrome-US-Desktop"], "oems": ["Samsung-S25"], "app_surfaces": ["Android-ShareSheet","iOS-Siri-Extension"]},
  "attestation": {"sigstore_bundle": "base64-DSSE", "rekor_uuid": "..."}
}
Notes
- Reference HELM and AILuminate identifiers for reproducibility.
- Keep privacy fields aligned with current consumer policy.
- Sign the feed. Provide rotating keys and a stable discovery URL in ai.txt.
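On the consuming side, a browser or OEM selector would typically validate the feed's structure and then check the detached signature before routing anything to it. The sketch below assumes a JSON Schema subset of the v0.3 fields and a raw Ed25519 public key fetched from the discovery URL; a real integration would verify the Sigstore bundle and Rekor inclusion proof instead.

import json

import jsonschema
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

# Minimal structural check; a production selector would validate far more of v0.3.
FEED_SCHEMA = {
    "type": "object",
    "required": ["model", "capabilities", "safety", "evals", "eligibility"],
    "properties": {
        "model": {"type": "object", "required": ["name", "version"]},
        "evals": {"type": "array", "minItems": 1},
    },
}

def accept_feed(feed_bytes: bytes, signature: bytes, publisher_key: bytes) -> dict:
    """Parse, structurally validate, and signature-check a capability feed."""
    feed = json.loads(feed_bytes)
    jsonschema.validate(instance=feed, schema=FEED_SCHEMA)
    try:
        Ed25519PublicKey.from_public_bytes(publisher_key).verify(signature, feed_bytes)
    except InvalidSignature:
        raise ValueError("capability feed signature does not verify; refuse to route")
    return feed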
3) Co-marketing bundles that convert brand into default
Turn above-the-line media and launches into measurable default picks.
Browser bundle
- Asset: Ask AI quick-pick tile plus a weeklong featured slot in the sidebar.
- Proofs required: safety.vc.jsonld, evals.vc.jsonld, signed capabilities feed.
- Offer: 90-day premium trial, usage cap unlock, and team trials tied to browser accounts.
- KPIs: default conversion rate from click, day-30 stickiness, assisted search reduction, p95 latency under load.
OEM onboarding bundle
- Asset: setup screen card, side-button long press option, keyboard row shortcut.
- Proofs required: offline fallback mode, on-device safety settings, transparent eval badges.
- Offer: device maker rev share on premium upgrades, customer care deflection credits, regional compliance kit.
- KPIs: setup attach rate, side-button activation rate, cross-device retention.
App share sheet bundle
- Asset: share-to-assistant target with multi-file context and template prompts.
- Proofs required: document handling policy VC, content deletion SLA, signed scopes.
- Offer: creator tier perks, storage credits, collaborative workspaces.
- KPIs: share sheet invocation rate, task completion, assisted creation sessions.
AEO eligibility checklist for new surfaces
Discovery
- Advertise capability feeds, eval badges, and policy endpoints in ai.txt and robots.json equivalents. For Chrome-specific surfaces, see our playbook on the Chrome Gemini Omnibox answer surface.
- Support structured outputs that let surfaces enforce JSON schema. For iOS on-screen scenarios, apply patterns from the iOS 26 visual intelligence AEO guide.
Safety
- Report AILuminate and HELM with seeds, commit hashes, and dates as verifiable claims.
- Document age gating modes and refusal policies as VC fields.
Integrity
- Provide Sigstore DSSE bundles, Rekor inclusion proofs, and rotating keys.
Privacy
- Declare retention windows, training opt-in status, and region-specific processing.
Commercials
- Share a SKU matrix for consumer, team, and enterprise plus rev share terms. For signed feed strategy that boosts ranking signals, review the Shopify Web Bot Auth for AEO.
Partner RFP packet
- One pager: value proposition, unit economics, regional coverage.
- Compliance folder: policy VC, DPIA summary, red team scope, incident response.
- Technical folder: capabilities.json, feed signing guide, rate limits, latency SLOs.
- Evidence folder: eval badges, attestation proofs, prior integration case studies.
Measurement framework
- Acquisition: share of default by surface, attach rate, opt-in rate.
- Quality: p95 latency, crash rate, safety violation rate per million requests.
- Retention: 7, 30, and 90-day stickiness, default churn to competitors.
- Incremental value: net new queries from Ask AI surfaces, reduction in web search leakage, premium conversion from default cohorts.
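The acquisition and retention lines above reduce to a handful of ratios per surface. The sketch below shows one reasonable way to define default conversion and day-30 stickiness from cohort counts; the event names and numbers are made up for illustration.

from dataclasses import dataclass

@dataclass
class SurfaceCohort:
    surface: str                 # e.g. "Chrome-US-Desktop" (illustrative label)
    tile_clicks: int             # users who tapped the Ask AI or picker entry
    defaults_set: int            # users who made the assistant their default
    active_day_30: int           # of those, still using it as default at day 30

def default_conversion_rate(c: SurfaceCohort) -> float:
    """Acquisition KPI: default picks per picker click."""
    return c.defaults_set / c.tile_clicks if c.tile_clicks else 0.0

def day_30_stickiness(c: SurfaceCohort) -> float:
    """Retention KPI: share of new defaults still active at day 30."""
    return c.active_day_30 / c.defaults_set if c.defaults_set else 0.0

cohort = SurfaceCohort("Chrome-US-Desktop", tile_clicks=12000, defaults_set=3100, active_day_30=1900)
print(f"{default_conversion_rate(cohort):.1%} conversion, {day_30_stickiness(cohort):.1%} day-30 stickiness")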
90-day execution plan
- Weeks 1 to 2: finalize safety policy VC, sign with Sigstore, and backfill historical evals into VC and badge formats.
- Weeks 3 to 5: publish capabilities feed v0.3, automate signing, ship ai.txt discovery.
- Weeks 6 to 8: run an OEM pilot with a setup-screen card and side-button option, with measured rev share and SLAs.
- Weeks 9 to 10: secure a browser Ask AI tile with verified proofs and a 90-day premium offer.
- Weeks 11 to 12: expand to Android share sheet integration and begin iOS Siri extension outreach.
Risk controls
- Do not over-claim safety. Publish limits with links to failure modes in HELM scenarios.
- Keep evals fresh for every release with model commit IDs and dataset hashes.
- Ensure clear opt-in and retention windows for consumer data.
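Keeping evals fresh is easier to enforce if every published score is pinned to the exact model commit and dataset bytes it was produced from. A small helper along these lines (file layout and field names are illustrative) can feed the evals block of the capability feed.

import datetime
import hashlib

def pinned_eval_record(suite_id: str, score: float, model_commit: str, dataset_path: str) -> dict:
    """Tie an eval score to a model commit ID and a dataset content hash."""
    with open(dataset_path, "rb") as f:
        dataset_sha256 = hashlib.sha256(f.read()).hexdigest()
    return {
        "suite": suite_id,
        "score": score,
        "model_commit": model_commit,
        "dataset_sha256": dataset_sha256,
        "run_date": datetime.date.today().isoformat(),
    }

# Example (hypothetical paths and IDs): the returned dict slots into the evals
# section of capabilities.json before the feed is re-signed for each release.
record = pinned_eval_record("helm-capabilities:v1.0", 0.674, "model@a1b2c3d", "data/helm_subset.jsonl")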
Bottom line
Anthropic’s consumer move and the rapid spread of Ask AI surfaces and OEM assistant pickers make defaults the new battleground. Teams that ship verifiable trust proofs, machine-readable capability feeds, and performance-backed co-marketing bundles will convert brand awareness into durable default positions. Reference the launch moment in Anthropic's "Keep Thinking" launch and anchor your proofs to W3C’s VC standard to accelerate partner acceptance.