Analytics
Path: Admin Dashboard → Analytics (bar chart icon in sidebar)
The Analytics page gives you a portal-level overview of usage, quality, and coverage, updated in real time as users generate test cases.
Time Range Filter
Use the time range selector at the top to scope all metrics and charts.

All stat cards, charts, and the recent activity table update immediately when you change the range.
Summary Stat Cards
Ten metric cards appear in a 4-column grid at the top of the page:

Specifications: number of unique specification documents processed.
Test Cases: total test cases generated across all runs.
Active Users: unique users who ran at least one generation in the selected period.
Hours Saved: estimated manual hours saved, based on the average manual test case writing time per test case.
Avg. Rating: mean user quality rating (out of 5) across all rated generations.
Edit Rate: percentage of test cases that were edited after generation.
Avg. Coverage: average requirement coverage across all generations, i.e. the percentage of requirements that received at least one test case.
Generations: total generation runs in the selected period.
AI Defect Rate: percentage of test cases flagged as low-confidence by the AI model.
AI Confidence: average AI confidence score across all generated test cases.
Avg. Coverage is calculated automatically at the end of every generation. No manual action is required. It reflects how completely the AI mapped your requirements to test cases.
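As a rough sketch of how the summary cards could be derived from raw generation records, the snippet below computes Hours Saved, Edit Rate, Avg. Coverage, and AI Confidence. The record shapes, field names, and the `MINUTES_PER_MANUAL_TC` constant are illustrative assumptions, not the portal's actual schema:

```python
from dataclasses import dataclass, field

# Hypothetical record shapes -- field names are illustrative only.
@dataclass
class TestCase:
    edited: bool       # was the test case modified after generation?
    confidence: float  # AI confidence score, 0.0-1.0

@dataclass
class Generation:
    requirements: int          # requirements found in the spec
    requirements_covered: int  # requirements with at least one test case
    test_cases: list[TestCase] = field(default_factory=list)

MINUTES_PER_MANUAL_TC = 15  # assumed average manual writing time per TC

def summary_stats(gens: list[Generation]) -> dict[str, float]:
    """Aggregate card metrics; assumes at least one generation and test case."""
    tcs = [tc for g in gens for tc in g.test_cases]
    return {
        "hours_saved": len(tcs) * MINUTES_PER_MANUAL_TC / 60,
        "edit_rate": 100 * sum(tc.edited for tc in tcs) / len(tcs),
        "avg_coverage": 100 * sum(
            g.requirements_covered / g.requirements for g in gens
        ) / len(gens),
        "ai_confidence": sum(tc.confidence for tc in tcs) / len(tcs),
    }
```

Note that Avg. Coverage is averaged per generation rather than pooled across all requirements, so small and large specifications weigh equally.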
Charts
Usage Over Time
A line chart showing the number of generations and total test cases produced per day over the selected period.
Use this to identify:
Peak usage days
The impact of sharing the portal with new teams
Drop-offs that may indicate friction in the workflow
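The daily series behind this chart amounts to counting generation runs per day. A minimal sketch, assuming you have each run's date (the `usage_over_time` helper is hypothetical):

```python
from collections import Counter
from datetime import date

def usage_over_time(run_dates: list[date]) -> list[tuple[date, int]]:
    # One (day, generation count) point per day, sorted chronologically,
    # matching the line chart's x/y series.
    return sorted(Counter(run_dates).items())
```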
Specification Types Distribution
A chart showing the breakdown of specification file types processed (e.g. PDF vs. DOCX).
This is useful for understanding your team's document format preferences.
Quality Rating Distribution
A histogram of user quality ratings (1-5 stars).
A concentration at higher ratings indicates users are satisfied with generation quality.
A concentration at lower ratings may indicate the need to review generation settings or prompt templates.
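The histogram buckets can be reproduced from raw ratings in a few lines; the `rating_histogram` helper below is a hypothetical sketch, not the portal's implementation:

```python
from collections import Counter

def rating_histogram(ratings: list[int]) -> dict[int, int]:
    # Bucket 1-5 star ratings; zero-fill empty buckets so the chart
    # always shows all five bars.
    counts = Counter(ratings)
    return {stars: counts.get(stars, 0) for stars in range(1, 6)}
```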
Recent Activity Table
Below the charts, a table shows the most recent generations in the selected period.

Portal Filter
If you manage multiple portals, use the Portal dropdown at the top to view analytics for a specific portal or across all portals at once.
Interpreting the Data
High Hours Saved + Low Edit Rate: generation quality is high and users trust the output as-is.
High Edit Rate: users are modifying many test cases. Consider refining your format template or enabling the Requirements Review step.
Low Avg. Coverage: the AI is not covering all requirements. This may indicate poorly structured specifications or very large documents where some sections are missed.
High AI Defect Rate: many test cases are low-confidence. Review the specification quality and consider enabling the Requirements Review step so users can validate requirements before test case generation.
Low generation count: the portal may not be discoverable. Ensure the portal URL has been shared with your team.
Many delta generations: your team is actively managing specification revisions. Consider enabling GitHub integration for automated script delivery.
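If you want to automate these checks (e.g. for a weekly health report), they reduce to simple threshold rules. The thresholds, field names, and function below are illustrative assumptions, not product defaults:

```python
# Threshold values are illustrative assumptions, not product defaults.
EDIT_RATE_HIGH = 40.0    # percent
COVERAGE_LOW = 70.0      # percent
DEFECT_RATE_HIGH = 20.0  # percent

def interpret(stats: dict[str, float]) -> list[str]:
    """Return advisory notes for metrics that cross the assumed thresholds."""
    notes = []
    if stats["edit_rate"] > EDIT_RATE_HIGH:
        notes.append("High Edit Rate: consider refining the format template "
                     "or enabling the Requirements Review step.")
    if stats["avg_coverage"] < COVERAGE_LOW:
        notes.append("Low Avg. Coverage: check specification structure and size.")
    if stats["ai_defect_rate"] > DEFECT_RATE_HIGH:
        notes.append("High AI Defect Rate: review specification quality.")
    return notes
```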