Reviewing and editing

After generation, you arrive at the Results view. This is where you read, refine, and manage your test cases.


Browsing test cases

Expanding a test case

Click any test case row to expand it and see its full content:

  • Title - Concise description of what is being tested

  • Objective - One-sentence purpose statement

  • Preconditions - Setup required before executing the test

  • Test Steps - Numbered sequence of actions

  • Expected Result - Observable outcome that constitutes a pass

  • Priority - Critical / High / Medium / Low

  • Requirement Reference - Link back to the source requirement
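The fields above can be pictured as a simple record. A minimal sketch (field names follow the list above; the types and the example values are assumptions for illustration):

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    """One generated test case, mirroring the fields shown in the expanded row."""
    title: str                # concise description of what is being tested
    objective: str            # one-sentence purpose statement
    preconditions: str        # setup required before executing the test
    steps: list               # numbered sequence of actions
    expected_result: str      # observable outcome that constitutes a pass
    priority: str             # "Critical" / "High" / "Medium" / "Low"
    requirement_ref: str      # link back to the source requirement

# Hypothetical example record
tc = TestCase(
    title="Login with valid credentials",
    objective="Verify a registered user can sign in.",
    preconditions="A registered user account exists.",
    steps=["Open the login page", "Enter valid credentials", "Click Sign in"],
    expected_result="The user lands on the dashboard.",
    priority="High",
    requirement_ref="REQ-12",
)
```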

Sorting

Use the sort controls to order by:

  • ID - Original generation order

  • Confidence Score - AI-assessed quality (highest first)

  • Priority - Critical items first
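The three sort orders can be sketched as follows, assuming each test case carries an id, a confidence score, and a priority (field names are illustrative, not the product's internal schema):

```python
# Critical items sort first, per the documented priority order
PRIORITY_RANK = {"Critical": 0, "High": 1, "Medium": 2, "Low": 3}

cases = [
    {"id": 2, "confidence": 91, "priority": "Medium"},
    {"id": 1, "confidence": 55, "priority": "Critical"},
    {"id": 3, "confidence": 78, "priority": "High"},
]

by_id = sorted(cases, key=lambda c: c["id"])                                # original generation order
by_confidence = sorted(cases, key=lambda c: c["confidence"], reverse=True)  # highest first
by_priority = sorted(cases, key=lambda c: PRIORITY_RANK[c["priority"]])     # Critical first
```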

Searching

Type in the search bar to filter test cases by keyword. Matches are found in the title, steps, and expected results, and the list updates as you type.
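The keyword filter described above can be sketched as a case-insensitive substring match over the title, steps, and expected result (the function name and dict layout are illustrative):

```python
def matches(case: dict, query: str) -> bool:
    """Return True if the query appears in the title, any step, or the expected result."""
    q = query.lower()
    haystacks = [case["title"], case["expected_result"], *case["steps"]]
    return any(q in text.lower() for text in haystacks)

cases = [
    {"title": "Login works", "steps": ["Enter credentials"], "expected_result": "Dashboard shown"},
    {"title": "Logout", "steps": ["Click logout"], "expected_result": "Login page shown"},
]
filtered = [c for c in cases if matches(c, "dashboard")]
```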

Filtering by requirement

Use the Requirement filter dropdown to show only test cases linked to a specific requirement.


Editing a test case

Inline edit

1. Click a test case row to expand it and open the full view.

2. Click Edit (pencil icon). All fields become editable text areas.

3. Make changes: edit the Title, Objective, Preconditions, Test Steps, Expected Result, Priority, or Requirement Reference as needed.

4. Click Save to store your changes locally and sync them to the database automatically, or click Cancel to discard them.


Edits are auto-saved to the database in the background. You can safely close the tab after editing and your changes will be there when you return.


Managing individual test cases

Deleting a test case

1. Open the test case row to reveal actions.

2. Click the Delete (trash) icon.

3. Confirm deletion. The test case is removed from the list, auto-save updates the database, and coverage metrics are affected accordingly.

Copying content to clipboard

1. Expand the test case to open the full test case view.

2. Click Copy (clipboard icon).

3. The full test case text is copied to your clipboard.
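The copied text presumably concatenates the visible fields. A hypothetical sketch of such a formatter (the exact layout the product uses is not documented here):

```python
def to_clipboard_text(case: dict) -> str:
    """Render a test case as plain text, roughly mirroring the expanded view."""
    steps = "\n".join(f"{i}. {s}" for i, s in enumerate(case["steps"], start=1))
    return (
        f"Title: {case['title']}\n"
        f"Objective: {case['objective']}\n"
        f"Preconditions: {case['preconditions']}\n"
        f"Test Steps:\n{steps}\n"
        f"Expected Result: {case['expected_result']}\n"
        f"Priority: {case['priority']}"
    )

# Hypothetical example
text = to_clipboard_text({
    "title": "Login",
    "objective": "Verify sign-in.",
    "preconditions": "Account exists.",
    "steps": ["Open page", "Sign in"],
    "expected_result": "Dashboard shown.",
    "priority": "High",
})
```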


Spec Coverage panel

The Coverage tab (or the side panel in widescreen) shows a summary of how well your requirements are covered:

  • Requirements covered - Number of requirements with at least one test case

  • Spec section coverage % - Covered requirements / total requirements

  • AVG AI Confidence - Average confidence score across the generated test cases

  • AI Defect rate

A yellow or red indicator appears when the single-coverage ratio exceeds the threshold configured by your admin.
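The two documented formulas can be sketched directly: a requirement counts as covered when at least one test case links to it, and the coverage percentage is covered over total (names are illustrative):

```python
def coverage_metrics(requirement_ids, test_cases):
    """Compute 'Requirements covered' and 'Spec section coverage %' as defined above."""
    # A requirement is covered if at least one test case references it
    covered = {tc["requirement_ref"] for tc in test_cases} & set(requirement_ids)
    pct = 100 * len(covered) / len(requirement_ids) if requirement_ids else 0.0
    return len(covered), pct

# Hypothetical example: 4 requirements, 3 test cases covering 2 of them
covered, pct = coverage_metrics(
    ["REQ-1", "REQ-2", "REQ-3", "REQ-4"],
    [{"requirement_ref": "REQ-1"}, {"requirement_ref": "REQ-3"}, {"requirement_ref": "REQ-1"}],
)
```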


Requirements view

Click the Requirements tab to see every extracted requirement along with its reference.


Confidence scores

Each test case carries a Confidence Score, shown as the AI Score badge (0–100), based on three sub-metrics:

  • Requirement Coverage - How well the test case reflects the requirement you provided.

  • Spec Utilization - How much the test case draws on the provided specification/context beyond the requirement itself. Low values mean the requirement alone was sufficient.

  • Grounded Accuracy - Whether the test case stays factual without adding unsupported assumptions (no hallucinations).

  • Green badge (70–100): High confidence

  • Yellow badge (40–69): Moderate confidence, review recommended

  • Red badge (0–39): Low confidence, manual review strongly advised
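The badge bands above map directly from score to color; a minimal sketch:

```python
def badge_color(score: int) -> str:
    """Map an AI confidence score (0-100) to its badge color per the documented bands."""
    if score >= 70:
        return "green"   # high confidence
    if score >= 40:
        return "yellow"  # moderate confidence, review recommended
    return "red"         # low confidence, manual review strongly advised
```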


Productivity badge

At the top of the Results view, a Productivity badge estimates the time saved compared to writing test cases manually. This is a motivational metric based on average manual test-case writing times.
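As a rough illustration of how such an estimate could work (the per-test-case average used here is an assumed placeholder, not the product's actual figure):

```python
# Placeholder average; the real value the product uses is not documented here
ASSUMED_MINUTES_PER_MANUAL_TC = 15

def estimated_time_saved(num_test_cases: int) -> int:
    """Estimate minutes saved versus writing each test case by hand."""
    return num_test_cases * ASSUMED_MINUTES_PER_MANUAL_TC
```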
