Inputs and rates
Example data table
| Scenario | Ingest (GB/day) | Retention (days) | Archive (%) | Query (GB scanned/day) | Users | Alerts | Dashboards | Export (GB/month) |
|---|---|---|---|---|---|---|---|---|
| Small app | 5 | 30 | 0 | 20 | 3 | 10 | 5 | 0 |
| Growing platform | 40 | 45 | 20 | 250 | 12 | 60 | 18 | 120 |
| Large workload | 200 | 90 | 50 | 1200 | 40 | 300 | 60 | 900 |
Formulas used
monthly_ingest_gb_raw = ingest_gb_per_day × billing_days
monthly_ingest_gb = monthly_ingest_gb_raw × (1 + ingest_overhead_pct/100)
Overhead models parsing, enrichment, retries, and duplicates.
stored_gb_raw = ingest_gb_per_day × retention_days
stored_gb = stored_gb_raw × (1 + ingest_overhead_pct/100)
archived_gb = stored_gb × (archive_pct/100)
hot_stored_gb = stored_gb − archived_gb
monthly_query_gb_raw = query_gb_per_day × billing_days
monthly_query_gb = monthly_query_gb_raw × (1 + query_overhead_pct/100)
index_cost = monthly_ingest_gb × rate_index_per_gb
hot_storage_cost = hot_stored_gb × rate_hot_storage_per_gbmo
archive_cost = archived_gb × rate_archive_storage_per_gbmo
query_cost = monthly_query_gb × rate_query_per_gb
users_cost = users × rate_user_per_user
alerts_cost = alerts × rate_alert_per_rule
dashboards_cost = dashboards × rate_dash_per_dash
egress_cost = export_gb_per_month × rate_egress_per_gb
total_monthly_cost = index_cost + hot_storage_cost + archive_cost + query_cost + users_cost + alerts_cost + dashboards_cost + egress_cost
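The formulas above can be sketched as a single Python function. The default rate values below are illustrative placeholders only, not any provider's actual pricing; replace them with numbers from your pricing sheet.

```python
def log_analytics_cost(
    ingest_gb_per_day, retention_days, archive_pct, query_gb_per_day,
    users, alerts, dashboards, export_gb_per_month,
    billing_days=30, ingest_overhead_pct=10, query_overhead_pct=10,
    # Placeholder rates (USD) -- substitute your provider's pricing sheet.
    rate_index_per_gb=0.50, rate_hot_storage_per_gbmo=0.03,
    rate_archive_storage_per_gbmo=0.005, rate_query_per_gb=0.005,
    rate_user_per_user=15.0, rate_alert_per_rule=0.10,
    rate_dash_per_dash=1.0, rate_egress_per_gb=0.09,
):
    """Return a per-bucket monthly cost breakdown mirroring the formulas above."""
    ingest_factor = 1 + ingest_overhead_pct / 100
    monthly_ingest_gb = ingest_gb_per_day * billing_days * ingest_factor
    # Steady-state stored volume: daily ingest x retained days, plus overhead.
    stored_gb = ingest_gb_per_day * retention_days * ingest_factor
    archived_gb = stored_gb * archive_pct / 100
    hot_stored_gb = stored_gb - archived_gb
    monthly_query_gb = query_gb_per_day * billing_days * (1 + query_overhead_pct / 100)
    breakdown = {
        "index": monthly_ingest_gb * rate_index_per_gb,
        "hot_storage": hot_stored_gb * rate_hot_storage_per_gbmo,
        "archive": archived_gb * rate_archive_storage_per_gbmo,
        "query": monthly_query_gb * rate_query_per_gb,
        "users": users * rate_user_per_user,
        "alerts": alerts * rate_alert_per_rule,
        "dashboards": dashboards * rate_dash_per_dash,
        "egress": export_gb_per_month * rate_egress_per_gb,
    }
    breakdown["total"] = sum(breakdown.values())
    return breakdown
```

Feeding in the "Small app" row from the example table (5 GB/day ingest, 30-day retention, no archive, 20 GB/day scanned, 3 users, 10 alerts, 5 dashboards, no export) yields roughly $141.75/month under these placeholder rates, with per-seat cost as the largest bucket.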
How to use this calculator
- Enter your average daily ingestion volume and retention days.
- Add archive percentage if you tier older logs.
- Estimate daily scanned data from typical searches and dashboards.
- Set counts for users, alerts, dashboards, and exports.
- Adjust rates to match your provider pricing sheet.
- Press Calculate to see the totals and breakdown above.
Cost drivers in log analytics
Log analytics spending usually comes from four buckets: ingest, storage, scanning, and extras. Ingest scales with daily volume and any parsing or enrichment overhead. Storage scales with the average data kept across your retention window. Scanning scales with how much data queries read, not how many queries you run. Extras include user access, alert rules, dashboards, and exported data charges. A breakdown helps you target the biggest lever.
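Targeting the biggest lever can be automated with a tiny helper. This is a minimal sketch; the bucket names in the example are hypothetical labels for the four buckets described above.

```python
def biggest_lever(breakdown):
    """Given {bucket: monthly_cost}, return the largest bucket and each bucket's share."""
    total = sum(breakdown.values())
    shares = {bucket: cost / total for bucket, cost in breakdown.items()}
    return max(shares, key=shares.get), shares

# Hypothetical monthly spend per bucket (USD):
lever, shares = biggest_lever({"ingest": 500, "storage": 200, "scan": 250, "extras": 50})
```

Here `lever` is `"ingest"` at 50% of spend, so sampling or verbosity reduction would move the bill more than any storage change.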
Estimating ingestion accurately
Start with an average GB-per-day value from your log pipeline or agent reports. If you compress or transform logs, measure after transformation, then add an overhead percentage to represent duplicate events, retries, or enrichment fields. Multiply by billing days to estimate monthly ingestion. For seasonal workloads, model peak and baseline months separately and compare totals. This calculator lets you adjust volume and overhead to see sensitivity.
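The peak-versus-baseline comparison above can be sketched in a few lines; the 15% overhead and the 40/70 GB-per-day volumes are assumed example values.

```python
def monthly_ingest_gb(ingest_gb_per_day, overhead_pct, billing_days=30):
    # Measure ingest after transformation, then add overhead for
    # duplicates, retries, and enrichment fields.
    return ingest_gb_per_day * billing_days * (1 + overhead_pct / 100)

baseline = monthly_ingest_gb(40, 15)  # typical month: ~1380 GB
peak = monthly_ingest_gb(70, 15)      # seasonal peak month
```

Comparing `baseline` and `peak` month totals shows how much budget headroom seasonal traffic requires.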
Retention and archive strategy
Retention influences cost by determining how much data is stored at any moment. A practical steady-state estimate is daily ingestion multiplied by retained days. If older logs can move to an archive tier, apply an archive percentage to shift part of the stored volume to a lower rate. Compliance often needs longer retention for security events, while application debugging may only need weeks. Use scenarios to balance risk, performance, and budget.
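The steady-state split between hot and archive tiers follows directly from the estimate above. A minimal sketch, using the "Large workload" row from the example table:

```python
def storage_split(ingest_gb_per_day, retention_days, archive_pct, overhead_pct=0):
    # Steady-state stored volume: daily ingest x retained days, plus overhead.
    stored = ingest_gb_per_day * retention_days * (1 + overhead_pct / 100)
    archived = stored * archive_pct / 100
    return stored - archived, archived  # (hot_gb, archive_gb)

# "Large workload": 200 GB/day, 90-day retention, 50% archived
hot, cold = storage_split(200, 90, 50)
```

Half of the 18,000 GB steady-state volume lands in each tier, so the archive rate applies to 9,000 GB that would otherwise bill at the hot-storage rate.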
Query scanning optimization
Query cost depends on scanned GB. High-cardinality searches, broad time ranges, and missing filters can multiply scanning. Reduce scanned data by tightening time windows, using structured fields, partitioning by service, and pre-aggregating common metrics. Also consider saved searches and dashboards that refresh frequently. Add query overhead to represent reruns, automated reports, and incident response bursts. Track “GB scanned per day” as a first-class operations metric.
Governance and forecasting
Once you have a baseline, forecast growth by increasing ingestion and adjusting user counts as teams adopt observability. Tie alert and dashboard counts to services and environments to avoid unowned assets. Review export volumes for downstream data lakes and cross-region transfers. Revisit rates when provider contracts change, then rerun the calculator to validate budgets. A monthly trend of unit metrics—GB ingested, GB stored, GB scanned—keeps surprises out of finance reviews.
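A simple compound-growth projection supports the forecasting step above; the 10% monthly growth rate in the example is an assumed figure, not a benchmark.

```python
def forecast_ingest_gb_per_day(current_gb_per_day, monthly_growth_pct, months):
    # Compound a monthly growth rate forward; rerun the calculator with
    # each projected volume to validate future budgets.
    return [current_gb_per_day * (1 + monthly_growth_pct / 100) ** m
            for m in range(1, months + 1)]

# Assumed 10% month-over-month growth from a 100 GB/day baseline:
projection = forecast_ingest_gb_per_day(100, 10, 6)
```

Feeding each projected daily volume back through the calculator turns the unit-metric trend into a budget trend.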
FAQs
What inputs matter most for accuracy?
Daily ingestion and retention drive most storage and indexing costs. Next, estimate scanned data per day, because wide queries can dominate spend. Keep rates aligned to your provider pricing and revise assumptions after one billing cycle.
How should I estimate GB scanned per day?
Use query logs or dashboard refresh statistics if available. Otherwise, approximate by multiplying average scanned GB per query by queries per day, then add overhead for reruns during incidents. Start conservative and refine monthly.
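The fallback estimate above can be written down directly; the 25% rerun overhead is an assumed conservative starting point to refine monthly.

```python
def scanned_gb_per_day(avg_gb_per_query, queries_per_day, rerun_overhead_pct=25):
    # Overhead covers incident-response bursts, scheduled reports, and
    # reruns; 25% is an assumed default, not a measured value.
    return avg_gb_per_query * queries_per_day * (1 + rerun_overhead_pct / 100)
```

For example, 100 queries per day scanning 2 GB each yields an estimate of 250 GB scanned per day after overhead.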
Why does the calculator use daily ingestion × retention days?
It approximates steady-state stored volume when logs arrive evenly. If your workload is bursty, model separate months or increase overhead. For newly deployed systems, early months may store less than steady state.
When should I use archiving?
Archiving helps when you need long retention but rarely query older logs. Move cold periods to cheaper storage and keep hot data searchable. Validate restore time, compliance requirements, and any rehydration fees before relying on archives.
How can I reduce log analytics cost quickly?
Lower scanned data by narrowing time ranges, adding filters, and using structured fields. Reduce ingestion by sampling noisy debug logs and lowering verbosity in production. Right-size alert rules and dashboard refresh schedules.
Do exports affect cost significantly?
They can, especially when exporting to other regions or external systems. Track monthly export GB from pipelines and scheduled jobs. If costs climb, aggregate before export and avoid moving full-fidelity logs unnecessarily.