Script generation
Script Generation converts a test case into an executable automation script in your chosen language and framework. The script includes the test steps, assertions, and the scaffolding needed to run it directly.
This feature must be enabled by your admin in Test Case Settings.
Supported languages and frameworks
| Language | Framework |
|---|---|
| Python | pytest |
| Python | unittest |
| Python | Robot Framework |
| JavaScript | Jest |
| JavaScript | Playwright |
| TypeScript | Jest |
| Java | JUnit |
| C# | NUnit |
| API Testing | Postman Collection (JSON) |
| Custom | User-specified framework |
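To make the output concrete, here is a hypothetical sketch of what a pytest-target generation might produce: step comments, assertions, and enough scaffolding to run directly. All names (`logged_in_user`, the profile fields) are illustrative, not part of the product.

```python
# Illustrative shape of a generated pytest script.
# The fixture-style helper and the data it returns are assumptions for this sketch.

def logged_in_user():
    # Scaffolding: set up the precondition described in the test case
    return {"username": "demo", "session": "abc123"}

def test_profile_shows_username():
    user = logged_in_user()
    # Step 1: open the profile page (simulated with plain data here)
    profile = {"displayed_name": user["username"]}
    # Step 2: verify the expected result from the test case
    assert profile["displayed_name"] == "demo"
```

A real generated script would target your application rather than plain dictionaries, but the structure (setup, steps, assertions) is the same.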
Generating a single script
Configure the script
Fill in the generation settings:
| Setting | Description |
|---|---|
| Project Context | (Optional) Paste a public GitHub repo URL to tailor scripts to your project's libraries, patterns, and conventions. |
| Test Type | API, UI/E2E, Integration, or Unit |
| Language | Programming language or framework preset |
| Generation Profile | Project conventions, Strict (no external deps), or Custom |
| Additional Instructions | (Optional) Any guidance for code style, naming, assertions, or libraries to use |
| Target Path | (Optional) File path where this script should be placed in your repo (e.g. tests/auth/test_login.py) |
Test Type differences
Each test type injects a specific hint into the AI prompt:
- API: focuses on HTTP calls, request/response validation, and status codes
- UI/E2E: focuses on browser interaction, selectors, and visual assertions
- Integration: focuses on service interactions, data flows, and side effects
- Unit: focuses on isolated function/method testing, mocking, and edge cases
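As an illustration of the API hint, a generated script typically makes an HTTP call, checks the status code, and validates the response body. The sketch below is a hypothetical example in pytest style; the endpoint URL and payload shape are assumptions, and the network call is mocked so the test runs in isolation.

```python
import json
from unittest import mock
import urllib.request

def get_user(base_url, user_id):
    # Make the HTTP call and return (status code, parsed JSON body)
    with urllib.request.urlopen(f"{base_url}/users/{user_id}") as resp:
        return resp.status, json.loads(resp.read())

def test_get_user_returns_200_and_valid_body():
    # Stub the network layer so the assertion logic can be exercised offline
    fake = mock.MagicMock()
    fake.status = 200
    fake.read.return_value = b'{"id": 1, "name": "Ada"}'
    fake.__enter__.return_value = fake
    with mock.patch("urllib.request.urlopen", return_value=fake):
        status, body = get_user("https://api.example.com", 1)
    # Status-code and response-validation assertions, as the API hint emphasises
    assert status == 200
    assert body["id"] == 1
    assert body["name"] == "Ada"
```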
Connect repository context (optional but recommended)
For the best script quality, connect your repository so TCG can learn your project's conventions:
1. Click Connect Repository.
2. Follow the GitHub integration flow to authorise TCG.
3. TCG scans your repo and detects:
   - Primary language
   - Testing frameworks installed
   - Existing test file patterns
   - Build system (Maven, npm, Gradle, etc.)
   - Allowed imports list
This creates an Executable Profile that is used to generate scripts that fit your codebase.
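Conceptually, the profile collects the scan results above into one record. The sketch below shows one plausible shape as a plain Python dictionary; the field names and values are assumptions for illustration, not the product's actual schema.

```python
# Hypothetical Executable Profile assembled from a repository scan.
# Every key and value here is illustrative.
executable_profile = {
    "primary_language": "python",
    "frameworks": ["pytest"],                 # testing frameworks installed
    "test_file_pattern": "tests/test_*.py",   # existing test file convention
    "build_system": "pip",                    # Maven, npm, Gradle, etc.
    "allowed_imports": ["pytest", "requests", "myapp"],
}
```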
Understanding the quality score
Every generated script is automatically evaluated with a Quality Score (0–100):
| Criterion | Weight | What it measures |
|---|---|---|
| Assertion Coverage | 30% | Number of assert / expect / verify statements relative to steps |
| Step Coverage | 25% | Ratio of meaningful code lines to total script length |
| No Placeholders | 20% | Absence of TODO, FIXME, pass, NotImplemented |
| Allowed Imports | 15% | All imports are from the permitted list |
| No Secrets | 10% | No hardcoded API keys, passwords, or tokens |
Scripts scoring:
- 75–100: Green — ready to use
- 50–74: Yellow — usable, minor issues
- Below 50: Red / Draft — significant gaps, manual review required
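The weights above sum to 100%, so the overall score is a straightforward weighted combination. This sketch shows how the published weights and thresholds fit together, assuming each sub-score is normalised to the range 0–1 (the normalisation itself is not documented here).

```python
# Weights from the quality-score table (they sum to 1.0)
WEIGHTS = {
    "assertion_coverage": 0.30,
    "step_coverage": 0.25,
    "no_placeholders": 0.20,
    "allowed_imports": 0.15,
    "no_secrets": 0.10,
}

def quality_score(subscores: dict) -> int:
    """Combine per-criterion sub-scores (each 0.0-1.0) into a 0-100 score."""
    return round(100 * sum(WEIGHTS[k] * subscores.get(k, 0.0) for k in WEIGHTS))

def band(score: int) -> str:
    """Map a score to the colour bands documented above."""
    if score >= 75:
        return "green"
    if score >= 50:
        return "yellow"
    return "red"

perfect = quality_score({k: 1.0 for k in WEIGHTS})  # 100
```

For example, a script with full assertion coverage but a hardcoded token would lose at most the 10% No Secrets weight, while one riddled with TODOs loses the 20% No Placeholders weight.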
Actions after generation
- Copy to clipboard: click the copy icon in the code viewer
- Download script: click Download to save the script as a .py, .js, .ts, .java, .cs, or .json file
- Push to GitHub: click Push to GitHub (see GitHub integration)
- Regenerate: click Regenerate to try again with the same or updated settings