1. Start Workers
Purpose: Start the three background workers locally so they can process queue messages for ingestion, fact extraction, and validation.
Prerequisites:
- Bun installed and `bun install` completed
- `.env.local` configured with Upstash Redis and Supabase credentials
- AI provider API key for the model backing each tier (check the `ai_model_tier_config` DB table for current assignments; e.g., `GOOGLE_API_KEY` if tiers use Gemini, `ANTHROPIC_API_KEY` if Claude)
Cost / Duration: $0 | ~2 minutes
Prompt
Start all three pipeline workers locally. Run each in a separate terminal:
1. **Ingestion worker** (handles INGEST_NEWS, CLUSTER_STORIES, RESOLVE_IMAGE):
```bash
bun run dev:worker-ingest
```
2. **Facts worker** (handles EXTRACT_FACTS, GENERATE_EVERGREEN, EXPLODE_CATEGORY_ENTRY, and related messages):
```bash
bun run dev:worker-facts
```
3. **Validation worker** (handles VALIDATE_FACT):
```bash
bun run dev:worker-validate
```
After starting all three, verify each is healthy by querying its health endpoint (if the workers listen on different ports via the PORT env var, substitute each worker's port for 8080):
```bash
curl http://localhost:8080/health
```
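If each worker was started with its own PORT, a small loop can check all three at once. This is a sketch: the ports 8080-8082 are placeholder assumptions, not confirmed defaults, so substitute whatever PORT each worker was actually given.

```shell
# Probe a worker's /health endpoint and report up/down per port.
# Ports 8080-8082 below are placeholder assumptions -- substitute the
# PORT value each worker was started with.
health() {
  if curl -sf --max-time 2 "http://localhost:$1/health" >/dev/null; then
    echo "port $1: up"
  else
    echo "port $1: down"
  fi
}

for port in 8080 8081 8082; do
  health "$port"
done
```

A worker that is still booting will report "down"; re-run the loop after a few seconds before digging into logs.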
If a worker fails to start, check terminal output for errors. Common issues:
- Redis connection error: verify `UPSTASH_REDIS_REST_URL` and `UPSTASH_REDIS_REST_TOKEN` in `.env.local`
- Supabase auth error: verify `SUPABASE_URL` and `SUPABASE_SERVICE_ROLE_KEY` in `.env.local`
- Port already in use: kill the existing process or set a different `PORT` env var
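The first two issues above can be caught before starting anything with a quick preflight check on `.env.local`. The `check_env` helper below is a hypothetical convenience, not part of the project; the variable names are the ones listed in the troubleshooting items:

```shell
# Hypothetical preflight check: confirm required variables are present in
# .env.local before launching the workers. Variable names come from the
# troubleshooting list above; extend the list if your tiers need an AI key
# such as GOOGLE_API_KEY or ANTHROPIC_API_KEY.
check_env() {
  local file="$1"; shift
  local missing=0
  for var in "$@"; do
    if ! grep -q "^${var}=" "$file"; then
      echo "MISSING: $var"
      missing=1
    fi
  done
  return $missing
}

if [ -f .env.local ]; then
  check_env .env.local \
    UPSTASH_REDIS_REST_URL \
    UPSTASH_REDIS_REST_TOKEN \
    SUPABASE_URL \
    SUPABASE_SERVICE_ROLE_KEY \
    && echo "env looks complete"
fi
```

This only checks that the keys exist, not that their values are valid; a typo'd token still surfaces as a Redis or Supabase error at worker startup.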
Verification
- Ingestion worker started and healthy
- Facts worker started and healthy
- Validation worker started and healthy
- Health endpoint returns HTTP 200 for each worker
Related Prompts
- Run Workers High Concurrency -- Scale up for large pipeline runs
- Check Queue Health -- Verify queues are clear before processing