Comprehensive Cloudflare platform skill covering Workers, D1, R2, KV, AI, Durable Objects, and security.
references/observability/configuration.md
## Configuration Patterns

### Enable Workers Logs

```jsonc
// wrangler.jsonc
{
  "observability": {
    "enabled": true,
    "head_sampling_rate": 1 // 100% sampling (default)
  }
}
```

**Best Practice**: Use structured JSON logging for better indexing and filtering.

```typescript
// Good - structured logging
console.log({
  user_id: 123,
  action: "login",
  status: "success",
  duration_ms: 45
});

// Avoid - unstructured string
console.log("user_id: 123 logged in successfully in 45ms");
```

### Enable Workers Traces

```jsonc
// wrangler.jsonc
{
  "observability": {
    "traces": {
      "enabled": true,
      "head_sampling_rate": 0.05 // 5% sampling
    }
  }
}
```

**Note**: Default sampling is 100%. For high-traffic Workers, use a lower `head_sampling_rate` (0.01–0.1).

### Configure Analytics Engine

**Bind to Worker**:
```toml
# wrangler.toml
analytics_engine_datasets = [
  { binding = "ANALYTICS", dataset = "api_metrics" }
]
```

**Write Data Points**:
```typescript
export interface Env {
  ANALYTICS: AnalyticsEngineDataset;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // Track metrics
    env.ANALYTICS.writeDataPoint({
      blobs: ['customer_123', 'POST', '/api/v1/users'],
      doubles: [1, 245.5], // request_count, response_time_ms
      indexes: ['customer_123'] // for efficient filtering
    });

    return new Response('OK');
  }
}
```

### Configure Tail Workers

Tail Workers receive logs/traces from other Workers for filtering, transformation, or export.

**Setup**: the Tail Worker (e.g. `log-processor`, built from `src/tail.ts`) is deployed as its own Worker. The `tail_consumers` entry goes in the config of the *producer* Worker being tailed:
```toml
# wrangler.toml of the producer Worker (the Worker being tailed)
name = "my-worker"

[[tail_consumers]]
service = "log-processor" # Tail Worker that receives this Worker's events
```

**Tail Worker Example** (`src/tail.ts` of `log-processor`):
```typescript
export default {
  async tail(events: TraceItem[], env: Env, ctx: ExecutionContext) {
    // Filter errors only
    const errors = events.filter(event =>
      event.outcome === 'exception' || event.outcome === 'exceededCpu'
    );

    if (errors.length > 0) {
      // Send to external monitoring
      ctx.waitUntil(
        fetch('https://monitoring.example.com/errors', {
          method: 'POST',
          body: JSON.stringify(errors)
        })
      );
    }
  }
}
```

### Configure Logpush

Send logs to external storage (S3, R2, GCS, Azure, Datadog, etc.). Requires a Business or Enterprise plan.

**Via Dashboard**:
1. Navigate to Analytics → Logs → Logpush
2. Select destination type
3. Provide credentials and bucket/endpoint
4. Choose dataset (e.g., Workers Trace Events)
5. Configure filters and fields

**Via API**:
```bash
curl -X POST "https://api.cloudflare.com/client/v4/accounts/{account_id}/logpush/jobs" \
  -H "Authorization: Bearer <API_TOKEN>" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "workers-logs-to-s3",
    "destination_conf": "s3://my-bucket/logs?region=us-east-1",
    "dataset": "workers_trace_events",
    "enabled": true,
    "frequency": "high",
    "filter": "{\"where\":{\"and\":[{\"key\":\"ScriptName\",\"operator\":\"eq\",\"value\":\"my-worker\"}]}}"
  }'
```

### Environment-Specific Configuration

**Development** (verbose logs, full sampling):
```jsonc
// wrangler.dev.jsonc
{
  "observability": {
    "enabled": true,
    "head_sampling_rate": 1.0,
    "traces": {
      "enabled": true
    }
  }
}
```

**Production** (reduced sampling, structured logs):
```jsonc
// wrangler.prod.jsonc
{
  "observability": {
    "enabled": true,
    "head_sampling_rate": 0.1, // 10% sampling
    "traces": {
      "enabled": true
    }
  }
}
```

Deploy with env-specific config:
```bash
wrangler deploy --config wrangler.prod.jsonc --env production
```
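After deploying, a quick way to confirm logs are actually flowing is to stream them live with `wrangler tail`, which surfaces the same events a Tail Worker receives. A minimal sketch, assuming the `my-worker` name from the examples above:

```bash
# Stream live log events from the deployed Worker as JSON
wrangler tail my-worker --format json

# Show only requests that ended in an error, e.g. to spot-check
# the error filtering configured in the Tail Worker above
wrangler tail my-worker --status error --format pretty
```

Since the JSON output closely resembles the `TraceItem` structure, it can also help when deciding which fields to keep in a Logpush `filter` or which sampling rate to set.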