Know exactly what your pipeline produces
Every plan includes a token allocation for AI operations. The analytics dashboard shows exactly how your tokens are spent — per-operation costs, source performance, generation efficiency, and usage trends across your entire content pipeline.
Token usage tracking
See exactly how many tokens each operation consumes. Every pipeline step is metered individually — scraping, keyword generation, trends checks, article generation, humanization, and research for original content.
Your token balance is visible at all times in the dashboard. Usage history shows daily, weekly, and monthly breakdowns by operation type, content group, and source. Identify which pipeline steps consume the most tokens and optimize accordingly.
Source performance
Monitor which sources produce the most relevant content. Track scraping success rates per source — how often scraping succeeds, how many articles pass your content group filters, and how many make it all the way to publication.
Identify your best-performing sources and the ones that waste tokens on low-relevance content. Use this data to refine your source list, adjust keyword rules, or reconfigure scraping schedules for better ROI on token spend.
What each operation costs
Every AI-powered step in your pipeline consumes tokens. Knowing the cost per operation helps you plan your content volume and choose the right plan.
| Operation | Tokens per use |
|---|---|
| Scrape an article | ~5 |
| Generate keywords | ~10 |
| Google Trends check | ~5 |
| AI article generation | ~30 |
| Humanization | ~5 |
| Original research | ~20 |
| Original article | ~30 |
Actual costs vary by article length and model used.
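To see how the per-operation figures above add up, here's a minimal back-of-the-envelope sketch. The token values come straight from the table; the function and step names are illustrative, not part of the product's API, and actual costs will vary as noted.

```python
# Approximate per-operation token costs, taken from the table above.
# Actual costs vary by article length and model used.
TOKEN_COSTS = {
    "scrape": 5,
    "keywords": 10,
    "trends_check": 5,
    "article_generation": 30,
    "humanization": 5,
}

def estimate_tokens(steps, articles=1):
    """Estimate total token spend for running the given steps per article."""
    per_article = sum(TOKEN_COSTS[step] for step in steps)
    return per_article * articles

# A full scrape-to-publish pipeline for one article:
full_pipeline = ["scrape", "keywords", "trends_check",
                 "article_generation", "humanization"]

print(estimate_tokens(full_pipeline))       # ~55 tokens per article
print(estimate_tokens(full_pipeline, 100))  # ~5,500 tokens for 100 articles
```

At roughly 55 tokens per fully processed article, a 100-article month lands around 5,500 tokens, which is the kind of estimate the analytics dashboard lets you verify against actual spend.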
Cost transparency
Every operation is metered. The analytics dashboard shows your per-article cost across the full pipeline — from source scraping through humanization and publishing. Compare costs across content groups, sources, and AI models.
Identify the most cost-effective content strategies. See which model and template combinations deliver the best quality per token. Use historical data to forecast token usage and plan your content budget with confidence.
This feature is included on every paid plan. See plans and pricing →
Related Articles
5 ways to automate your news content pipeline
Manual content curation doesn't scale. Here are five automation strategies that modern content teams use to stay ahead.
Case studies: How Brevity replaced three tools and cut costs by 60%
Brevity was running three separate tools for source monitoring, rewriting, and publishing. Consolidating to Newsmill cut their content operations cost by 60% while improving output quality.