Chain-of-Thought: Cost Anomaly Investigation

Systematic investigation of unexpected LLM cost increases

Investigate this LLM cost anomaly systematically.

Cost Data:
{{cost_metrics}}

Normal Baseline:
{{baseline_costs}}

Anomaly Period:
{{anomaly_timeframe}}

Step 1: Quantify the anomaly magnitude and duration
Step 2: Correlate with traffic patterns - was there a usage spike?
Step 3: Analyze token consumption - input vs output distribution
Step 4: Check for prompt template changes during period
Step 5: Investigate model routing - any fallback to expensive models?
Step 6: Examine cache hit rates - did caching degrade?
Step 7: Review error rates - retries consuming extra tokens?

Identify the root cause and recommend prevention measures.
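
Example (not part of the prompt): a minimal Python sketch of how Step 1 (anomaly magnitude) and Step 6 (cache hit rate) might be quantified once {{cost_metrics}} and {{baseline_costs}} are filled in. The record fields (date, cost_usd, cache_hits, requests) and the sample numbers are illustrative assumptions, not a required schema.

```python
from statistics import mean, stdev

# Hypothetical daily records standing in for {{baseline_costs}} and {{cost_metrics}}.
baseline = [
    {"date": "2025-05-01", "cost_usd": 120.0, "cache_hits": 9100, "requests": 10000},
    {"date": "2025-05-02", "cost_usd": 118.0, "cache_hits": 9050, "requests": 10000},
    {"date": "2025-05-03", "cost_usd": 122.0, "cache_hits": 9200, "requests": 10000},
]
anomaly = [
    {"date": "2025-05-10", "cost_usd": 310.0, "cache_hits": 4200, "requests": 10500},
]

# Step 1: express magnitude as a multiple of the baseline mean and as a z-score.
base_costs = [d["cost_usd"] for d in baseline]
base_mean, base_std = mean(base_costs), stdev(base_costs)
for day in anomaly:
    multiple = day["cost_usd"] / base_mean
    z = (day["cost_usd"] - base_mean) / base_std if base_std else float("inf")
    print(f'{day["date"]}: {multiple:.1f}x baseline (z = {z:.1f})')

# Step 6: compare cache hit rates between the baseline and anomaly windows.
def hit_rate(days):
    return sum(d["cache_hits"] for d in days) / sum(d["requests"] for d in days)

print(f"cache hit rate: baseline {hit_rate(baseline):.1%} -> anomaly {hit_rate(anomaly):.1%}")
```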

Details

Category

Analysis

Use Cases

Cost investigation, Anomaly detection, Budget forensics

Works Best With

claude-sonnet-4-20250514, gpt-4o