Batch scripts
Batch Script Generation creates automation scripts for multiple test cases at once, processing them in parallel and packaging the results as a downloadable ZIP file.
This feature must be enabled by your admin in Test Case Settings.
When to use Batch Generation
You want scripts for your entire test suite at once
You are setting up a new automation project and need a starter set of scripts
You want to push all scripts to GitHub in one operation
Opening the Test Automation Builder
From the results view, click Test Scripts

The Test Automation Builder opens, showing your test case list and configuration panel.
Step-by-step walkthrough
Configure batch settings
Test Type
API, UI/E2E, Integration, or Unit
Language
Language and framework for all scripts
Generation Mode
Project, Production or Draft
Target Path
Base folder path in your repo (e.g. tests/)
Allow Drafts
If on, scripts with blocking preflight issues are generated as drafts instead of skipped
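The settings above can be sketched as a single configuration object. This is an illustrative sketch only; the field names and values below are assumptions, not the product's actual API:

```python
# Hypothetical batch configuration -- field names are illustrative only.
batch_config = {
    "test_type": "API",           # API, UI/E2E, Integration, or Unit
    "language": "python/pytest",  # language and framework for all scripts
    "mode": "Production",         # Project, Production, or Draft
    "target_path": "tests/",      # base folder path in your repo
    "allow_drafts": True,         # generate drafts for blocked cases instead of skipping
}
```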
Connect repository context (optional)
Click Connect Repository to scan your repo and build an Executable Profile. This significantly improves the quality of generated scripts: the AI will use your actual testing framework, imports, and naming conventions.
See GitHub integration for the connection steps.
Review pre-flight classification
Before starting, TCG runs pre-flight checks on every selected test case and classifies them:
Queued
Ready to generate
Blocked
Has a blocking issue; will be skipped unless Allow Drafts is on
Warning
Has a warning; will be generated but may be a draft
Test cases with blocking issues are shown with a warning icon and a description of the problem. Fix the test case or enable Allow Drafts to include them anyway.
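The triage above can be sketched as a simple precedence check (blocking issues win over warnings). This is an illustrative sketch of the classification rules described here, not the product's actual code:

```python
def classify(case):
    """Illustrative pre-flight triage: Blocked > Warning > Queued."""
    if case.get("blocking_issues"):
        return "Blocked"   # skipped unless Allow Drafts is on
    if case.get("warnings"):
        return "Warning"   # generated, but may come out as a draft
    return "Queued"        # ready to generate

cases = [
    {"id": "TC-1"},
    {"id": "TC-2", "warnings": ["no expected result"]},
    {"id": "TC-3", "blocking_issues": ["empty steps"]},
]
statuses = [classify(c) for c in cases]
# statuses == ["Queued", "Warning", "Blocked"]
```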
Start batch generation
Click Generate Scripts. The builder shows:
A progress bar with current count / total
Each test case's status updating in real time:
Running - Currently being generated
Generated - Script produced with score ≥ 50
Draft - Script produced but score < 50 (needs review)
Error - Generation failed after retries
You can click Stop at any time to abort remaining scripts in the queue. Already-completed scripts are preserved.
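The status rules above boil down to the score threshold and a failure flag. A minimal sketch, using the thresholds stated in this section (the function name is illustrative):

```python
def result_status(score, failed=False):
    """Map a generation outcome to its row status (threshold from the docs)."""
    if failed:
        return "Error"  # generation failed after retries
    # Score of 50 or more is a finished script; below 50 needs review.
    return "Generated" if score >= 50 else "Draft"
```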
Understanding the results table
After generation, each row in the table shows:
Test Case
Title of the test case
Status
Generated / Draft / Blocked / Error
Script File Name
Auto-generated file name (e.g. test_001_user_login.py)
Quality Score
0–100 score with color indicator
Actions
Download, view, or retry individually

Retrying failed scripts
If a script fails (for example, after a network error or AI timeout):
Click the Retry button on the failed row.
The script is regenerated with one retry attempt.
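Conceptually, a single-retry behavior looks like the wrapper below. This is a generic sketch, assuming the underlying generation call raises an exception on transient failure; it is not the product's implementation:

```python
def generate_with_retry(generate, retries=1):
    """Call generate(); on failure, retry up to `retries` more times."""
    for attempt in range(retries + 1):
        try:
            return generate()
        except Exception:
            if attempt == retries:
                raise  # out of retries: surface the error
```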
Downloading results
Download all as ZIP
Click Download All (ZIP). The ZIP contains:
One script file per generated test case
Files are named test_NNN_<sanitized_title>.<ext> in order
A manifest.json with metadata (TC IDs, quality scores, file names)
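The ZIP layout above can be reproduced with the standard library. The sanitization rule and manifest fields below are assumptions for illustration; only the overall naming pattern and the presence of manifest.json come from this section:

```python
import io
import json
import re
import zipfile

def sanitize(title):
    """Illustrative title sanitizer (not the product's exact rules)."""
    return re.sub(r"[^a-z0-9]+", "_", title.lower()).strip("_")

# One (title, script) pair per generated test case.
scripts = [("User login", "def test_login():\n    ...\n")]

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    manifest = []
    for i, (title, code) in enumerate(scripts, start=1):
        name = f"test_{i:03d}_{sanitize(title)}.py"  # test_NNN_<sanitized_title>.<ext>
        zf.writestr(name, code)
        # Hypothetical manifest fields; score is unknown at this point.
        manifest.append({"tc_id": i, "file": name, "quality_score": None})
    zf.writestr("manifest.json", json.dumps(manifest, indent=2))
```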

Download individual scripts
Click the Download icon on any row to download that single script.
Pushing to GitHub
Click Push All to GitHub to commit all generated scripts to your repository.
See GitHub integration for details on how to connect and where files are committed.
Saving automation results
Batch results are saved locally. If you close the browser and return to the same generation via History, the previously generated scripts and their scores are preserved.