AgileTest Generator: Preparing Test Cases Through AI-Guided Conversations

Hi guys! I'm Kayson from AgileTest.

AgileTest Generator is built to change how teams prepare test cases by turning conversations with AI into executable test cases. Testers can collaborate with an AI assistant that understands context, asks follow-up questions, and guides test design step by step.

To understand why conversation-based test case preparation matters today, it helps to look at how the practice has evolved. This article traces that evolution from fully manual processes to AI-integrated workflows, examines the challenges that remain in current approaches, and then demonstrates how AgileTest Generator enables a conversation-driven path from requirements to executable test cases.

1. The Evolution of Test Case Preparation

Before advanced AI-integrated testing tools became available, most teams relied heavily on manual effort to design and document test cases. Since then, test case preparation has evolved as teams adopted new tools and technologies to reduce effort and improve efficiency.

In practice, this evolution can be seen across three main levels of test case preparation maturity: 

[Image: The Evolution of Test Case Preparation]

Level 1: Fully Manual Test Case Preparation (Before 2020)

At the earliest stage, teams created test cases entirely by hand. Testers prepared each test case independently, building it step by step from scratch. QA/QC teams strictly followed the fundamental test process: reviewing requirements and acceptance criteria, identifying testable items, and defining detailed test steps and expected results.

During this period, AI tools for test design were not yet available to the public. Teams relied entirely on human experience and manual effort. Some teams attempted to reduce effort by reusing test case templates or modifying previously created test cases. However, these workarounds still required significant time for review, rework, and validation. As projects grew, this approach often led to scalability challenges, with test preparation consuming a significant portion of the testing timeline.

Level 2: Using General-Purpose AI for Test Ideation (2020 – 2024)

Around 2022, the public release of ChatGPT marked a turning point in how testers approached test case preparation. It was soon followed by other general-purpose AI models such as Gemini and Copilot. At this stage, testers began leveraging AI tools to generate lists of potential test scenarios based on requirement descriptions. Teams could ask these models to suggest standard flows, edge cases, and negative scenarios instead of brainstorming manually.

However, the overall workflow remained fragmented. Although AI could suggest ideas, testers still had to manually review and filter the generated scenarios, reformat the content to match internal standards, and copy the results into their test management tools. Additional effort was required to validate that the suggested test cases aligned with project-specific requirements, naming conventions, and coverage expectations.

As a result, while this approach accelerated test ideation, it did not eliminate manual steps. Testers remained responsible for structuring inputs correctly and ensuring the final test cases were accurate and ready for execution.
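The fragmented Level 2 workflow can be sketched in a few lines: a tester assembles a prompt for a general-purpose model, then parses the free-text reply into scenario rows that still have to be reviewed, reformatted, and pasted into the test management tool by hand. This is an illustrative sketch, not any specific tool's API; the model call itself is out of scope, so a sample reply stands in for it.

```python
def build_prompt(requirement: str) -> str:
    """Assemble a test-ideation prompt from a requirement description."""
    return (
        "Suggest standard flows, edge cases, and negative scenarios "
        f"for the following requirement:\n{requirement}"
    )

def parse_scenarios(reply: str) -> list[str]:
    """Extract bullet-point scenarios from a free-text model reply."""
    return [
        line.lstrip("-* ").strip()
        for line in reply.splitlines()
        if line.strip().startswith(("-", "*"))
    ]

# Example free-text reply a tester might get back from the model:
reply = """Here are some scenarios:
- Login succeeds with valid credentials
- Login fails with an expired password
- Login is throttled after five failed attempts
"""

scenarios = parse_scenarios(reply)
print(len(scenarios))  # 3
```

Even after parsing, each scenario is only an idea: the tester still owns structuring it into steps and expected results, which is exactly the manual gap the later levels address.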


Level 3: AI Integrated into Test Management Tools (2024 – Present)

Gradually, more testing applications began integrating AI models directly into their platforms. In this approach, testers provide structured inputs (requirements, acceptance criteria, etc.) within the test management tool itself. The integrated AI then analyzes this information and generates well-structured test cases, including scenarios, test steps, and expected results, in the appropriate format. For example, with the AI Generator in the AgileTest app, testers can select a requirement with a detailed description, and the AI agent uses these details as structured input to generate test cases with different scenarios, test steps, and expected results.

Although this approach is more advanced and tightly integrated, it does not remove the need for human input. Teams still need to brainstorm testing ideas, clarify requirements, and provide meaningful context so the AI can interpret what needs to be tested.
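To make the Level 3 output concrete, the record such a tool produces from a requirement can be sketched as a single structure holding the scenario, ordered steps, and expected results, all traced back to a requirement key. The field names here are illustrative assumptions, not AgileTest's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    requirement_key: str  # the requirement this case traces back to
    scenario: str
    steps: list[str] = field(default_factory=list)
    expected_results: list[str] = field(default_factory=list)

# One generated case for a hypothetical login requirement:
case = TestCase(
    requirement_key="REQ-101",
    scenario="User logs in with valid credentials",
    steps=[
        "Open the login page",
        "Enter a valid username and password",
        "Click Sign in",
    ],
    expected_results=[
        "The dashboard is displayed",
        "A session is created for the user",
    ],
)
print(case.requirement_key)  # REQ-101
```

The structure is ready for execution, but filling it well still depends on the human context described above: someone must decide which scenarios matter and whether the generated steps reflect the real requirement.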

2. What If Test Case Preparation Could Be Even More Effective?

Even with AI integrated into testing applications, test case preparation can still be improved. Testers are often required to provide detailed inputs, manually structure information, and carefully guide the integrated AI to generate usable test cases. Teams still need to think through test scenarios, decide what should be covered, and ensure the generated results align with business requirements and testing standards.

This means that while Level 3 reduces operational friction, it does not completely eliminate manual thinking or decision-making. The opportunity lies in making AI interactions more natural and efficient, where testers can describe requirements and testing intent through simple conversations, and the AI can transform that context into structured, executable test cases with minimal additional effort.

3. AgileTest Generator

AgileTest Generator is designed to address the remaining gaps in Level 3 test case preparation by changing how testers interact with AI during test design. Developed as a customized Rovo Agent within AgileTest, the Generator focuses on requirement-related testing activities. It helps teams refine requirement descriptions, identify missing details, and progressively transform requirements into test-ready test cases.

It behaves like an assistant that works alongside the tester throughout the process. Rather than asking testers to manually structure all inputs upfront, AgileTest Generator supports a conversational workflow. Testers can start with an existing requirement and interact with the agent through simple prompts. Based on the context, the agent guides the next steps: suggesting test scenarios and eventually generating structured test cases with detailed steps and expected results.

What differentiates AgileTest Generator from traditional AI-assisted tools is the way testers interact with it. Instead of filling in predefined inputs, testers engage in an ongoing conversation with the AI agent. This approach reduces the workload burden on testers. Teams are no longer required to translate ideas into rigid formats before seeing results. Instead, they collaborate with the AI in a way that feels closer to working with a teammate: one that assists with test design while leaving key decisions, validation, and prioritization in human hands.

4. From Conversation to Execution: A Step-by-Step Example

Step 1: Choose A Specific Requirement

After completing installation and setup, you can click Ask Rovo inside the AgileTest app, then select AgileTest Generator. Start a new chat and ask the agent to create test cases for a specific requirement.

[Image: Step 1: Choose A Specific Requirement]

But what if the requirement is blank or not well defined? No worries: you can ask the agent directly to improve the requirement description.

Step 2: Improve Requirement Descriptions

You can ask AgileTest Generator to generate details that clarify your requirement description. It will provide a draft that defines the goals, restricts the testing scope, determines the criteria for completing a test, and so on. You can keep asking the agent to modify these details until they match your preferences.

[Image: Step 2: Improve Requirement Description]

Step 3: Suggest Test Cases

Once the requirement details are ready, you can ask the agent to create test cases based on these updates. It will ask follow-up questions to clarify your request. Just follow the conversation flow, and executable test cases with detailed test steps will be ready to use.

[Image: Step 3: Suggest Test Cases]

Step 4: Generate Test Cases

Once you have finished your adjustments, you can confirm which test cases should be generated. All generated test cases are linked directly to the requirement you selected initially, ensuring clear traceability and visibility between requirements and their corresponding test cases.

[Image: Step 4: Generate Test Cases]
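The traceability described in Step 4 amounts to a mapping from each requirement to the test cases generated from it, so coverage per requirement is easy to inspect. A minimal sketch, with made-up requirement keys and test case IDs:

```python
def link_cases(requirement_key: str, case_ids: list[str],
               trace: dict[str, list[str]]) -> None:
    """Record which generated test cases belong to a requirement."""
    trace.setdefault(requirement_key, []).extend(case_ids)

trace: dict[str, list[str]] = {}
# Two generation rounds against the same requirement accumulate:
link_cases("REQ-101", ["TC-1", "TC-2"], trace)
link_cases("REQ-101", ["TC-3"], trace)

print(trace["REQ-101"])  # ['TC-1', 'TC-2', 'TC-3']
```

Keeping this link explicit is what lets a team answer "which test cases cover this requirement?" without re-reading either side.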

Final Thoughts

Test case preparation has evolved, but turning ideas into executable test cases still takes effort. AgileTest Generator simplifies this by using conversation as the starting point for test design, helping teams move from requirements to execution more naturally. Instead of replacing testers, it supports them by reducing manual work while keeping human judgment at the center of quality.
