# Eko v2 Rollout User Checklist

**Audience:** Solo operator (Jonathan). **Scope:** Only items requiring human action on external services: API keys, accounts, config, DNS, manual verification. Code tasks are handled in branches and are out of scope.
## Phase 1: Pre-Deploy (Accounts & Credentials)

### 1.1 Supabase

- Create production Supabase project (or confirm existing)
- Copy credentials from Dashboard > Settings > API:
| Env Var | Where to get it |
|---|---|
| `SUPABASE_URL` | Dashboard > Settings > API > Project URL |
| `SUPABASE_ANON_KEY` | Dashboard > Settings > API > `anon` public key |
| `SUPABASE_SERVICE_ROLE_KEY` | Dashboard > Settings > API > `service_role` key |
| `DATABASE_URL` | Dashboard > Settings > Database > Connection string (Transaction mode via Supavisor) |
- Enable Google OAuth provider: Dashboard > Authentication > Providers > Google
  - Requires Google Cloud OAuth client ID + secret
- Enable Apple OAuth provider: Dashboard > Authentication > Providers > Apple
  - Requires Apple Services ID + key from the Apple Developer portal
- Configure Auth redirect URLs: add `https://app.eko.day/**` to allowed redirect URLs
- Confirm RLS is enabled on all tables (migrations create policies, but verify)
### 1.2 Upstash Redis

- Create a Redis database at console.upstash.com
- Copy credentials:

| Env Var | Where to get it |
|---|---|
| `UPSTASH_REDIS_REST_URL` | Upstash Console > Database > REST API > Endpoint |
| `UPSTASH_REDIS_REST_TOKEN` | Upstash Console > Database > REST API > Token |
### 1.3 AI Providers

- Anthropic API key from console.anthropic.com
- OpenAI API key from platform.openai.com (required for cost-optimized routing; 95% of calls use `gpt-4o-mini`)
- Google AI key from aistudio.google.com (required; Gemini 2.5 Flash powers the validation pipeline, Phases 3 and 4c)
| Env Var | Where to get it |
|---|---|
| `AI_PROVIDER` | Set to `anthropic` (default) |
| `ANTHROPIC_API_KEY` | Anthropic Console > API Keys |
| `OPENAI_API_KEY` | OpenAI Platform > API Keys |
| `GOOGLE_API_KEY` | Required; Google AI Studio > API Keys (needed for the fact validation pipeline) |
| `AI_MODEL_ANTHROPIC` | (optional) Override default model, e.g. `claude-haiku-4-5` |
| `AI_MODEL_OPENAI` | (optional) Override default model, e.g. `gpt-4o-mini` |
| `AI_MODEL_GOOGLE` | (optional) Override default model, e.g. `gemini-2.0-flash` |
| `ANTHROPIC_DAILY_SPEND_CAP_USD` | (optional) Default `5.00`; daily spend cap before fallback to GPT-4o-mini |
| `OPUS_ESCALATION_ENABLED` | (optional) Default `false`; enable Opus 4.6 escalation for complex cases |
| `OPUS_MAX_DAILY_CALLS` | (optional) Default `20`; max Opus calls/day |
### 1.4 News APIs (Fact Engine)

- Obtain a news API key (e.g., NewsAPI.org, GNews, or similar)
- (Optional) Google News API key if using Google's news endpoint

| Env Var | Where to get it |
|---|---|
| `NEWS_API_KEY` | Your news API provider dashboard |
| `GOOGLE_NEWS_API_KEY` | (optional) Google API Console |
Tuning params (all have sane defaults):

| Env Var | Default | Purpose |
|---|---|---|
| `NEWS_INGESTION_INTERVAL_MINUTES` | 15 | How often to poll for news |
| `FACT_EXTRACTION_BATCH_SIZE` | 10 | Articles processed per extraction run |
| `VALIDATION_MIN_SOURCES` | 2 | Minimum sources for fact validation |
| `NOTABILITY_THRESHOLD` | 0.6 | Score threshold for publishing facts |
| `EVERGREEN_DAILY_QUOTA` | 20 | Max evergreen facts generated/day |
| `EVERGREEN_ENABLED` | false | Enable evergreen fact generation |
### 1.5 Stripe (Billing)

- Create Stripe account or confirm existing at dashboard.stripe.com
- Create 5 Price objects in Stripe (Products > Create product > Add price):
  - Base plan (monthly)
  - Pro plan (monthly)
  - Team plan (monthly)
  - Plus plan, monthly (new in v2)
  - Plus plan, annual (new in v2)
- Create a webhook endpoint pointing to `https://app.eko.day/api/webhooks/stripe`
- Events to listen for: `checkout.session.completed`, `customer.subscription.updated`, `customer.subscription.deleted`, `invoice.payment_succeeded`, `invoice.payment_failed`
- Copy credentials:

| Env Var | Where to get it |
|---|---|
| `STRIPE_SECRET_KEY` | Stripe Dashboard > Developers > API Keys > Secret key |
| `STRIPE_WEBHOOK_SECRET` | Stripe Dashboard > Developers > Webhooks > Signing secret |
| `STRIPE_PRICE_BASE` | Price ID from Stripe (starts with `price_`) |
| `STRIPE_PRICE_PRO` | Price ID from Stripe |
| `STRIPE_PRICE_TEAM` | Price ID from Stripe |
| `STRIPE_PRICE_PLUS_MONTHLY` | Price ID for Plus monthly (new in v2) |
| `STRIPE_PRICE_PLUS_ANNUAL` | Price ID for Plus annual (new in v2) |

Note: `.env.example` is currently missing `STRIPE_PRICE_PLUS_MONTHLY` and `STRIPE_PRICE_PLUS_ANNUAL`. A code PR should add them.
### 1.6 Email (Resend)

- Create Resend account at resend.com
- Verify sending domain (e.g., `eko.day`) in the Resend dashboard
- Copy credentials:

| Env Var | Where to get it |
|---|---|
| `EMAIL_PROVIDER` | Set to `resend` to enable |
| `RESEND_API_KEY` | Resend Dashboard > API Keys |
| `RESEND_FROM_EMAIL` | Verified sender address, e.g. `Eko <alerts@eko.day>` |
### 1.7 Error Tracking (Sentry)

- Create Sentry project at sentry.io (or confirm existing)
- Copy DSN:

| Env Var | Where to get it |
|---|---|
| `ERROR_TRACKING_PROVIDER` | Set to `sentry` |
| `SENTRY_DSN` | Sentry > Project Settings > Client Keys (DSN) |
| `SENTRY_AUTH_TOKEN` | (optional, for source maps) Sentry > Settings > Auth Tokens |
| `SENTRY_ENVIRONMENT` | (optional) Defaults to `NODE_ENV` |
### 1.8 Enrichment (Optional)

- (Optional) Brandfetch API key for brand logos
- (Optional) People Data Labs API key for company firmographics

| Env Var | Where to get it |
|---|---|
| `BRANDFETCH_API_KEY` | brandfetch.com dashboard |
| `PDL_API_KEY` | peopledatalabs.com dashboard |
### 1.9 Internal Secrets

- Generate a strong random `CRON_SECRET` (e.g., `openssl rand -hex 32`)
- Set admin email allowlist

| Env Var | Value |
|---|---|
| `CRON_SECRET` | Random 32+ char hex string |
| `ADMIN_EMAIL_ALLOWLIST` | Comma-separated admin emails |
| `LOG_LEVEL` | `info` for production |
| `NODE_ENV` | `production` |
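A quick sketch of generating and sanity-checking the `CRON_SECRET` locally before pasting it into your env config:

```shell
# 32 random bytes, printed as 64 hex characters
CRON_SECRET=$(openssl rand -hex 32)

# Verify the length before using it (64 hex chars = 32 bytes)
test "${#CRON_SECRET}" -eq 64 && echo "CRON_SECRET=$CRON_SECRET"
```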
### 1.10 Application URLs

| Env Var | Production Value |
|---|---|
| `NEXT_PUBLIC_APP_URL` | `https://app.eko.day` |
| `NEXT_PUBLIC_PUBLIC_URL` | `https://eko.day` |
## Phase 2: Database Migrations

### 2.1 Apply v2 Migrations (0090-0113)

All 21 migrations must be applied in order (the numbering runs 0090-0113 with a gap; see below). Run via the Supabase CLI or the Dashboard SQL editor.
| # | File | Purpose |
|---|---|---|
| 0090 | 0090_ai_model_tiers.sql | AI model tier config table |
| 0091 | 0091_ai_model_tier_enum.sql | AI model tier enum type |
| 0092 | 0092_fact_engine_enums_and_taxonomy.sql | Fact engine enums & topic taxonomy |
| 0093 | 0093_fact_records.sql | Core fact_records table |
| 0094 | 0094_stories_and_news_sources.sql | Stories & news sources tables |
| 0095 | 0095_card_interactions_and_ingestion.sql | Card interactions & ingestion runs |
| 0096 | 0096_plan_updates_and_seed_taxonomy.sql | Plan updates & taxonomy seed data |
| 0100 | 0100_score_column.sql | Score column on fact records |
| 0101 | 0101_category_expansion.sql | Expanded topic categories |
| 0102 | 0102_card_format_free_text.sql | Card format free text column |
| 0103 | 0103_score_disputes.sql | Score dispute tracking |
| 0104 | 0104_reward_milestones.sql | Reward milestone tables |
| 0105 | 0105_challenge_formats.sql | Challenge format definitions |
| 0106 | 0106_challenge_format_junctions.sql | Challenge format junction tables |
| 0107 | 0107_interactions_format_tracking.sql | Interaction format tracking |
| 0108 | 0108_seed_challenge_formats.sql | Seed data for challenge formats |
| 0109 | 0109_challenge_sessions.sql | Challenge session management |
| 0110 | 0110_challenge_format_rls.sql | RLS policies for challenge formats |
| 0111 | 0111_drop_page_tracking_tables.sql | Drop legacy page tracking tables |
| 0112 | 0112_drop_brand_auxiliary_tables.sql | Drop legacy brand auxiliary tables |
| 0113 | 0113_drop_misc_legacy.sql | Drop misc legacy tables |
Gap note: Migrations 0097-0099 do not exist. This is intentional (reserved numbers).
- Back up production database before running migrations
- Apply migrations 0090-0096 (core v2 infrastructure)
- Apply migrations 0100-0110 (v2 features & challenge system)
- Apply migrations 0111-0113 (legacy table cleanup); destructive, run last
- Verify all RLS policies are active: `SELECT tablename, policyname FROM pg_policies;`
- Run `bun run db:types` to regenerate TypeScript types from the updated schema
### 2.2 Seed Data (Recommended)

- Verify taxonomy seed data was applied (migration 0096 seeds topic categories)
- Verify challenge format seed data was applied (migration 0108)
- (Optional) Seed feature flags via admin dashboard or SQL: `INSERT INTO feature_flags (key, enabled, description) VALUES ('ai_provider_anthropic', true, 'Enable Anthropic as AI provider'), ('ai_provider_openai', true, 'Enable OpenAI as AI provider'), ('ai_provider_google', false, 'Enable Google as AI provider');`

Feature flags are fail-open: if rows don't exist, providers default to enabled. Seeding is recommended for explicit control.
## Phase 3: Deploy

### 3.1 Vercel (Web + Admin + Public)

- Connect GitHub repo to Vercel project
- Configure 3 Vercel projects (or a mono-project with directory settings):
  - `apps/web` → `app.eko.day`
  - `apps/admin` → `admin.eko.day`
  - `apps/public` → `eko.day`
- Add all environment variables from Phase 1 to each Vercel project's settings
- Set `NODE_ENV=production` in Vercel environment settings
### 3.2 Register v2 Cron Routes in `vercel.json`

Current state: `apps/web/vercel.json` has only 5 legacy cron routes. The following 8 v2 fact-engine cron routes exist in code but are NOT registered; they won't fire until added:
| Route | Suggested Schedule | Purpose |
|---|---|---|
| `/api/cron/ingest-news` | `*/15 * * * *` (every 15 min) | Poll news APIs for new articles |
| `/api/cron/cluster-sweep` | `*/30 * * * *` (every 30 min) | Cluster similar articles into stories |
| `/api/cron/import-facts` | `*/15 * * * *` (every 15 min) | Extract facts from clustered articles |
| `/api/cron/generate-evergreen` | `0 */6 * * *` (every 6 hours) | Generate evergreen educational facts |
| `/api/cron/validation-retry` | `0 */2 * * *` (every 2 hours) | Retry failed fact validations |
| `/api/cron/archive-content` | `0 3 * * *` (3 AM daily) | Archive old content |
| `/api/cron/topic-quotas` | `0 0 * * *` (midnight daily) | Reset/enforce topic quotas |
| `/api/cron/daily-digest` | `0 8 * * *` (8 AM daily) | Send daily digest emails |
- Add these 8 routes to `apps/web/vercel.json` (code change needed; PR in progress or create one)
- Verify all 13 cron routes are registered after deploy:
  - Existing: `payment-reminders`, `payment-escalation`, `monthly-usage-report`, `account-anniversaries`, `daily-cost-report`
  - New v2: `ingest-news`, `cluster-sweep`, `import-facts`, `generate-evergreen`, `validation-retry`, `archive-content`, `topic-quotas`, `daily-digest`
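A sketch of the `crons` entries to merge into `apps/web/vercel.json`, using Vercel's standard `crons` schema (`path` + `schedule`). The schedules mirror the suggested table above; merge alongside the 5 existing legacy entries rather than replacing them:

```json
{
  "crons": [
    { "path": "/api/cron/ingest-news", "schedule": "*/15 * * * *" },
    { "path": "/api/cron/cluster-sweep", "schedule": "*/30 * * * *" },
    { "path": "/api/cron/import-facts", "schedule": "*/15 * * * *" },
    { "path": "/api/cron/generate-evergreen", "schedule": "0 */6 * * *" },
    { "path": "/api/cron/validation-retry", "schedule": "0 */2 * * *" },
    { "path": "/api/cron/archive-content", "schedule": "0 3 * * *" },
    { "path": "/api/cron/topic-quotas", "schedule": "0 0 * * *" },
    { "path": "/api/cron/daily-digest", "schedule": "0 8 * * *" }
  ]
}
```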
### 3.3 Fly.io Workers

Three Bun-based queue consumer workers need Fly.io deployment:

| Worker | App Directory | Purpose |
|---|---|---|
| `worker-ingest` | `apps/worker-ingest` | News article ingestion & clustering |
| `worker-facts` | `apps/worker-facts` | AI fact extraction from articles |
| `worker-validate` | `apps/worker-validate` | Multi-tier fact validation |
For each worker:

- Create Fly.io app: `fly apps create eko-worker-ingest` (repeat for each)
- Create `fly.toml` in each worker directory (no `fly.toml` exists yet)
- Set secrets on each Fly app: `fly secrets set SUPABASE_URL=... SUPABASE_ANON_KEY=... SUPABASE_SERVICE_ROLE_KEY=... DATABASE_URL=... UPSTASH_REDIS_REST_URL=... UPSTASH_REDIS_REST_TOKEN=... ANTHROPIC_API_KEY=... OPENAI_API_KEY=... CRON_SECRET=... NODE_ENV=production -a eko-worker-ingest`
- Deploy each worker: `fly deploy -a eko-worker-ingest`
- Verify the health endpoint responds: `curl https://eko-worker-ingest.fly.dev/health`
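Since no `fly.toml` exists yet, here is a minimal sketch for `apps/worker-ingest` (the region and the Dockerfile assumption are placeholders; port 8080 matches the workers' `PORT` default, and `/health` matches the endpoint checked above):

```toml
# apps/worker-ingest/fly.toml — minimal sketch, adjust before deploying
app = "eko-worker-ingest"
primary_region = "iad"          # assumption: pick a region near your Supabase project

[build]
  # assumption: a Dockerfile in this directory runs the Bun queue consumer

[http_service]
  internal_port = 8080          # workers listen on PORT (default 8080)
  force_https = true
  auto_stop_machines = false    # queue consumers should stay running
  min_machines_running = 1

[checks.health]
  type = "http"
  port = 8080
  path = "/health"
  interval = "30s"
  timeout = "5s"
```

Repeat with the app name changed for `eko-worker-facts` and `eko-worker-validate`.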
### 3.4 DNS Configuration
| Record Type | Name | Value | Service |
|---|---|---|---|
| CNAME | app.eko.day | cname.vercel-dns.com | Vercel (web app) |
| CNAME | admin.eko.day | cname.vercel-dns.com | Vercel (admin) |
| A/CNAME | eko.day | Vercel IP or CNAME | Vercel (public site) |
- Add DNS records at your domain registrar / Cloudflare
- Add custom domains in Vercel project settings
- Verify SSL certificates are provisioned (Vercel auto-provisions via Let's Encrypt)
## Phase 4: Post-Deploy Verification

### 4.1 Service Health Checks

- Web app loads at `https://app.eko.day`; login works (Google/Apple OAuth)
- Admin dashboard loads at `https://admin.eko.day`; an admin email can log in
- Public site loads at `https://eko.day`
- Stripe webhook test event succeeds (Stripe Dashboard > Webhooks > Send test event)
- Cron jobs fire on schedule (Vercel Dashboard > Crons > verify next runs)
- Workers respond to health checks on Fly.io
### 4.2 Fact Engine Smoke Test

- Trigger manual news ingestion: `curl -H "Authorization: Bearer $CRON_SECRET" https://app.eko.day/api/cron/ingest-news`
- Verify articles appear in the `ingestion_runs` table
- Trigger fact extraction: `curl -H "Authorization: Bearer $CRON_SECRET" https://app.eko.day/api/cron/import-facts`
- Verify facts appear in the `fact_records` table with status `pending_validation`
- Trigger validation: check `worker-validate` logs for processing
- Verify at least one fact reaches `validated` status
### 4.3 Subscription Flow

- Create a test subscription via Stripe test mode
- Verify the `user_subscriptions` table updates
- Verify Eko+ features unlock in the app
- Test upgrade from Free to Plus (monthly and annual)
### 4.4 Error Tracking
- Trigger a test error and verify it appears in Sentry
- Verify source maps are working (stack traces show original TypeScript)
## Phase 5: Ongoing Operations

### 5.1 Cost Monitoring

- Set up Anthropic usage alerts at console.anthropic.com
- Set up OpenAI usage alerts at platform.openai.com
- Set up Stripe billing alerts
- Monitor daily cost report emails (from the `daily-cost-report` cron)
- Review `ANTHROPIC_DAILY_SPEND_CAP_USD`; adjust from the default `$5.00` if needed
### 5.2 Feature Flag Management

Key feature flags to monitor/adjust via admin dashboard:

| Flag Key | Default | Purpose |
|---|---|---|
| `ai_provider_anthropic` | true (fail-open) | Kill switch for Anthropic |
| `ai_provider_openai` | true (fail-open) | Kill switch for OpenAI |
| `ai_provider_google` | false | Enable Google AI (not yet integrated) |
| `inline_diffs` | true | Legacy inline diff feature |
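If a provider misbehaves, the kill switch can also be flipped directly in SQL. A sketch, assuming the `feature_flags` table shape used by the Phase 2.2 seed insert:

```sql
-- Disable OpenAI as a provider (kill switch)
UPDATE feature_flags SET enabled = false WHERE key = 'ai_provider_openai';

-- Re-enable it once the incident is resolved
UPDATE feature_flags SET enabled = true WHERE key = 'ai_provider_openai';
```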
### 5.3 Content Moderation

- Set up a process to review published facts via the admin dashboard
- Configure `NOTABILITY_THRESHOLD` (default `0.6`); increase it to be more selective
- Monitor topic distribution via `topic-quotas` cron output
### 5.4 Backup & Recovery

- Enable Supabase Point-in-Time Recovery (PITR) on the Pro plan
- Document a rollback procedure for migrations 0111-0113 (destructive; no auto-rollback)
## Quick Reference: Full Environment Variable Checklist

All variable names match the `envSchema` in `packages/config/src/index.ts` exactly.
### Required

| Env Var | Service |
|---|---|
| `SUPABASE_URL` | Supabase |
| `SUPABASE_ANON_KEY` | Supabase |
| `SUPABASE_SERVICE_ROLE_KEY` | Supabase |
| `CRON_SECRET` | Internal |
### Strongly Recommended

| Env Var | Service |
|---|---|
| `DATABASE_URL` | Supabase (Supavisor pooler) |
| `UPSTASH_REDIS_REST_URL` | Upstash |
| `UPSTASH_REDIS_REST_TOKEN` | Upstash |
| `ANTHROPIC_API_KEY` | Anthropic |
| `OPENAI_API_KEY` | OpenAI |
| `GOOGLE_API_KEY` | Google AI (validation pipeline) |
| `STRIPE_SECRET_KEY` | Stripe |
| `STRIPE_WEBHOOK_SECRET` | Stripe |
| `STRIPE_PRICE_BASE` | Stripe |
| `STRIPE_PRICE_PRO` | Stripe |
| `STRIPE_PRICE_TEAM` | Stripe |
| `STRIPE_PRICE_PLUS_MONTHLY` | Stripe (new in v2) |
| `STRIPE_PRICE_PLUS_ANNUAL` | Stripe (new in v2) |
| `RESEND_API_KEY` | Resend |
| `RESEND_FROM_EMAIL` | Resend |
| `SENTRY_DSN` | Sentry |
### Optional / Tuning

| Env Var | Default | Service |
|---|---|---|
| `AI_PROVIDER` | `anthropic` | Config |
| `AI_MODEL_ANTHROPIC` | (auto) | Config |
| `AI_MODEL_OPENAI` | (auto) | Config |
| `AI_MODEL_GOOGLE` | (auto) | Config |
| `ANTHROPIC_DAILY_SPEND_CAP_USD` | 5.00 | Config |
| `OPUS_ESCALATION_ENABLED` | false | Config |
| `OPUS_MAX_DAILY_CALLS` | 20 | Config |
| `NEWS_API_KEY` | — | News provider |
| `GOOGLE_NEWS_API_KEY` | — | Google News |
| `NEWS_INGESTION_INTERVAL_MINUTES` | 15 | Config |
| `FACT_EXTRACTION_BATCH_SIZE` | 10 | Config |
| `VALIDATION_MIN_SOURCES` | 2 | Config |
| `NOTABILITY_THRESHOLD` | 0.6 | Config |
| `EVERGREEN_DAILY_QUOTA` | 20 | Config |
| `EVERGREEN_ENABLED` | false | Config |
| `CRON_BATCH_SIZE` | 100 | Config |
| `EMAIL_PROVIDER` | none | Config |
| `ERROR_TRACKING_PROVIDER` | none | Config |
| `EMAIL_ENABLED` | false | Config (legacy) |
| `SENTRY_ENABLED` | false | Config (legacy) |
| `BRANDFETCH_API_KEY` | — | Brandfetch |
| `PDL_API_KEY` | — | People Data Labs |
| `ADMIN_EMAIL_ALLOWLIST` | — | Config |
| `LOG_LEVEL` | `info` | Config |
| `NODE_ENV` | `development` | Config |
| `NEXT_PUBLIC_APP_URL` | `https://app.eko.day` | Config |
| `NEXT_PUBLIC_PUBLIC_URL` | `https://eko.day` | Config |
| `PORT` | 8080 | Workers |
| `SENTRY_ENVIRONMENT` | (`NODE_ENV`) | Config |