Workflow goal
Generates Gherkin test cases using parallel documentation analysis to handle large contexts.
| Variable Source | Phase 1: Metadata | Phase 2: Raw Content | Phase 3: Final Content |
|-----------------|-------------------|----------------------|------------------------|
| All System Documentation | 0/2 (not fetched) | 0/2 (pending) | 0/2 (pending) |
| User Story | 0/1 (not fetched) | 0/1 (pending) | 0/1 (pending) |

Output Format Templates

xray_csv_format: xray_test_case.csv (used in Step 13)

All System Documentation

- 430276610 (not yet fetched)
- 429654020 (not yet fetched)

User Story

- AQV-75 (not yet fetched)
1 System Context and Boundaries
Extract system context and boundaries from user story and all documentation
pending
System Prompt (raw)
Create a simple summary identifying what is within scope and out of scope for this user story.

## System Scope Summary

### Within Scope
- [Systems, components, processes that ARE part of this change]
- [Data elements, mappings, values that ARE affected]

### Out of Scope  
- [Systems, components, processes that are NOT part of this change]
- [Functionality that is NOT part of this user story]

### Testing Focus
- **Focus On**: [Main testing priorities]
- **Out of Scope**: [What is NOT part of this testing effort]

Keep summary short and actionable.
User Prompt (raw)
User Story:
{userInput}

System Documentation:
{all_system_docs}
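Throughout these prompts, placeholders such as {userInput} and {all_system_docs} are filled in from the workflow variables before a step runs. A minimal sketch of that interpolation, assuming simple str.format-style templates (render_prompt and the sample values are illustrative assumptions, not the tool's actual API):

```python
# Sketch of {variable} interpolation for the raw prompts above.
def render_prompt(template: str, variables: dict) -> str:
    # Each {name} placeholder is replaced by the matching variable value.
    return template.format(**variables)

user_prompt = (
    "User Story:\n{userInput}\n\n"
    "System Documentation:\n{all_system_docs}"
)
rendered = render_prompt(user_prompt, {
    "userInput": "As a tester, I want deterministic exports.",
    "all_system_docs": "Doc 430276610: export rules ...",
})
```

Note that str.format breaks on prompts containing literal braces; string.Template with ${name} placeholders would be the safer variant in that case.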
2 User Story Analysis
Analyze user story with 10 focused questions within established boundaries
pending
System Prompt (raw)
Analyze the user story and create exactly 10 focused questions within established system boundaries.

First, think step by step about what aspects need investigation. Then create your focused questions.

Focus on:
- What information do we need about the core topic?
- What is the current state vs. the change?
- What existing functionality must continue working?

Number each question 1-10 for easy reference. Stay within the established system boundaries.
User Prompt (raw)
User Story:
{userInput}

System Boundaries:
{system_context_extraction}
3 Documentation Analysis (Aggregated)
PARALLEL: Analyze each system documentation in parallel to answer questions
pending
System Prompt (raw)
Analyze this documentation to answer questions from the User Story Analysis within system boundaries.

First, carefully review the questions and identify which ones this documentation can answer. Then create your analysis table.

Create a table with the following columns:

| ID | Question | Relevant section | Answer |
|----|----------|------------------|--------|

Rules:
- No markdown syntax in table cells (use plain text)
- Only include questions that CAN be answered from this documentation
- Focus on information within established system boundaries
- The table structure itself must be valid markdown syntax (only the cell contents stay plain text).
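The rule that the table itself must be valid markdown can be spot-checked mechanically. A minimal sketch of such a check (is_valid_markdown_table is an illustrative helper, not part of the workflow):

```python
def is_valid_markdown_table(table: str) -> bool:
    # Basic structural check: every row is pipe-delimited and all rows
    # have the same cell count. A sketch, not a full markdown parser.
    rows = [r.strip() for r in table.strip().splitlines() if r.strip()]
    if len(rows) < 2:  # need at least a header row and a separator row
        return False
    widths = set()
    for row in rows:
        if not (row.startswith("|") and row.endswith("|")):
            return False
        widths.add(len(row.split("|")) - 2)  # cells between the outer pipes
    return len(widths) == 1
```

A stray pipe inside a cell changes the cell count and fails this check, which is one practical reason for the "no markdown syntax in table cells" rule above.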
User Prompt (raw)
System Boundaries:
{system_context_extraction}

User Story Analysis:
{user_story_analysis}

Documentation:
{all_system_docs}
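The parallel fan-out this step describes (one analysis per document, with the per-document tables aggregated afterwards) can be sketched with a thread pool; analyze_doc stands in for the actual model call and is an assumption, not the tool's real interface:

```python
from concurrent.futures import ThreadPoolExecutor

def analyze_doc(doc_id: str) -> str:
    # Placeholder for the per-document LLM analysis call; the real step
    # would send the system/user prompts above with this document's text.
    return f"| 1 | Sample question | {doc_id} | Sample answer |"

doc_ids = ["430276610", "429654020"]  # the two documents listed above
with ThreadPoolExecutor(max_workers=len(doc_ids)) as pool:
    tables = list(pool.map(analyze_doc, doc_ids))  # order is preserved

aggregated = "\n\n".join(tables)  # the "(Aggregated)" result fed to step 4
```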
4 Requirements Analysis
Extract test requirements from aggregated analysis results
pending
System Prompt (raw)
Extract test requirements specifically relevant to the user story change.

First, think step by step about what data elements, coverage areas, and requirements are directly impacted by this change. Then organize your findings.

## Requirements Analysis

### Data Elements (Change-Relevant)
| Data Element | Valid Values | Invalid Values | Change Impact |
|--------------|--------------|----------------|---------------|

### Coverage Areas
| Coverage Area | Why Relevant to Change | Priority |
|---------------|------------------------|----------|

### Requirements
| Requirement | How Change Affects It | Test Approach |
|-------------|----------------------|---------------|

Rules:
- Focus only on documented information relevant to the change.
- Mark unavailable info as *Not documented*
User Prompt (raw)
User Story:
{userInput}

System Boundaries:
{system_context_extraction}

Analysis Results:
{documentation_analysis}
5 Missing Information
Identify missing information needed for good test cases
pending
System Prompt (raw)
Identify missing data needed to create good test cases.

Create a simple list of follow-up questions:

## Missing Data Questions

1. [Specific missing value or data needed]
2. [Another missing piece of information]
3. [Additional data required for testing]

Focus on practical missing information like specific values, error codes, validation rules, or boundary values.
User Prompt (raw)
User Story:
{userInput}

System Boundaries:
{system_context_extraction}

Requirements Analysis:
{test_requirements_analysis}
6 Data Lookup (Aggregated from Chunks)
PARALLEL: Search all documentation chunks in parallel for the missing data
pending
System Prompt (raw)
Search through this documentation chunk to answer the missing data questions.

For each question that can be answered from THIS chunk, provide:

## Follow-Up Answers

| Question | Relevant Section | Answer |
|----------|------------------|--------|

Rules:
- No markdown syntax in table cells (use plain text)
- Only include questions that CAN be answered from THIS specific chunk
- If found, extract relevant section
- Provide practical answers for test case creation
- Skip questions that cannot be answered from this chunk
User Prompt (raw)
System Boundaries:
{system_context_extraction}

Missing Data Follow-Up Questions:
{missing_data_followup}

Documentation Chunk:
{all_system_docs}
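Splitting the documentation into chunks so each one can be searched in parallel might look like this minimal sketch (the fixed 2000-character size is an arbitrary illustration; the tool's actual chunking strategy is not specified here):

```python
def chunk_text(text: str, size: int = 2000) -> list[str]:
    # Fixed-size character chunks; a real splitter would likely cut on
    # section or paragraph boundaries instead.
    return [text[i:i + size] for i in range(0, len(text), size)]
```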
7 Test Plan
Create test plan with Feature and Regression tables
pending
System Prompt (raw)
Create test plan with four tables in two sections.

# Feature Test Cases (New/Changed Functionality)

## Positive
| Test Case ID | Test Case Title | Maps to Acceptance Criteria | Values to be used |
|--------------|-----------------|-----------------------------|-------------------|

## Negative  
| Test Case ID | Test Case Title | Maps to Acceptance Criteria | Values to be used |
|--------------|-----------------|-----------------------------|-------------------|

# Regression Test Cases (Existing Functionality Must Continue Working)

## Positive
| Test Case ID | Test Case Title | Related System Component | Values to be used |
|--------------|-----------------|--------------------------|-------------------|

## Negative
| Test Case ID | Test Case Title | Related System Component | Values to be used |
|--------------|-----------------|--------------------------|-------------------|

Test Plan Requirements:
- Only include test cases that are directly relevant to the user story change.
- Only include test cases that are backed by specific documented facts.
- The test title should be a short description of the business logic being tested.
- The 'Values to be used' column should contain the specific values that will populate the Examples table.
- Design the test plan for Scenario Outline + Examples pattern where it is applicable instead of repetitive test cases.
- Don't include idempotency, performance, formatting, or similar tests if there is no documentation to back them up.
- Each test case must reference the source documentation and provide concrete data values, not placeholders or generic descriptions.
User Prompt (raw)
User Story: {userInput}
System Boundaries: {system_context_extraction}
Requirements Analysis: {test_requirements_analysis}
Follow-Up Answers: {followup_answers}
8 Test Plan Review
Review test plan and identify issues - analysis only
pending
System Prompt (raw)
Review the test plan and identify issues without fixing them - analysis only.

First, systematically examine each test case for redundancy, classification, data quality, and coverage gaps. Then document your findings.

# Test Plan Review

## Redundant Test Cases
| Test Case ID | Test Case Title | Redundant to | Action |
|--------------|-----------------|--------------|--------|

## Misclassified Test Cases
| Test Case ID | Test Case Title | Current Table | Action |
|--------------|-----------------|---------------|--------|

## Data Quality, Boundary or Scope Issues or Irrelevant Test Cases
| Test Case ID | Test Case Title | Data Issue | Action |
|--------------|-----------------|------------|--------|

## Coverage Gaps
| Missing Coverage Area | Acceptance Criteria Gap | Suggested Test Case |
|-----------------------|-------------------------|---------------------|

Test Plan Requirements:
- Only include test cases that are directly relevant to the user story change.
- Only include test cases that are backed by specific documented facts.
- The test title should be a short description of the business logic being tested.
- The 'Values to be used' column should contain the specific values that will populate the Examples table.
- Design the test plan for Scenario Outline + Examples pattern where it is applicable instead of repetitive test cases.
- Don't include idempotency, performance, formatting, or similar tests if there is no documentation to back them up.
- Each test case must reference the source documentation and provide concrete data values, not placeholders or generic descriptions.
User Prompt (raw)
User Story: {userInput}
System Boundaries: {system_context_extraction}
Requirements Analysis: {test_requirements_analysis}
Follow-Up Answers: {followup_answers}
Test Plan to Review: {test_plan}
9 Final Test Plan
Apply fixes to produce final corrected test plan
pending
System Prompt (raw)
Apply fixes identified in the test plan review to produce the final corrected test plan.

# Test Plan

## Feature Test Cases

### Positive
| Test Case ID | Test Case Title | Example Data |
|--------------|-----------------|--------------|

### Negative
| Test Case ID | Test Case Title | Example Data |
|--------------|-----------------|--------------|

## Regression Test Cases

### Positive
| Test Case ID | Test Case Title | Example Data |
|--------------|-----------------|--------------|

### Negative
| Test Case ID | Test Case Title | Example Data |
|--------------|-----------------|--------------|

Test Plan Requirements:
- Only include test cases that are directly relevant to the user story change.
- Only include test cases that are backed by specific documented facts.
- The test title should be a short description of the business logic being tested.
- The 'Example Data' column should contain the specific values that will populate the Examples table.
- Design the test plan for Scenario Outline + Examples pattern where it is applicable instead of repetitive test cases.
- Don't include idempotency, performance, formatting, or similar tests if there is no documentation to back them up.
- Each test case must reference the source documentation and provide concrete data values, not placeholders or generic descriptions.
User Prompt (raw)
User Story: {userInput}
System Boundaries: {system_context_extraction}
Original Test Plan: {test_plan}
Test Plan Review: {test_plan_review}
10 Feature Test Cases
Generate feature test cases from verified test plan
pending
System Prompt (raw)
Generate Gherkin test cases for Feature tests from the final corrected test plan.

For each feature test case:
```gherkin
Feature: [Feature name]
Scenario Outline OR Scenario: [Test title]
  Given [setup for new/changed functionality]
  When [operation testing the change]
  Then [expected outcome]
  Examples:
    | [param 1] | [param 2] | [param 3] |
    | [value 1] | [value 2] | [value 3] |
    | [value 4] | [value 5] | [value 6] |
```

Rules:
- Only include test cases from "feature positive" and "feature negative" sections.
- Do NOT add, remove or adjust the title of the test cases you are generating.
- Do NOT include any test case from "regression positive" or "regression negative" sections.
- Use documented data.
User Prompt (raw)
System Boundaries: {system_context_extraction}
Requirements Analysis: {test_requirements_analysis}
Final Test Plan: {final_test_plan}
11 Regression Test Cases
Generate regression test cases from verified test plan
pending
System Prompt (raw)
Generate Gherkin test cases for Regression tests from the final corrected test plan.

For each regression test case:
```gherkin
Feature: [System component]
Scenario Outline OR Scenario: [Test title]
  Given [setup for existing functionality]
  When [operation testing unchanged functionality]
  Then [expected outcome proving it still works]
  Examples:
    | [param 1] | [param 2] | [param 3] |
    | [value 1] | [value 2] | [value 3] |
    | [value 4] | [value 5] | [value 6] |
```

Rules:
- Only include test cases from "regression positive" and "regression negative" sections.
- Do NOT add, remove or adjust the title of the test cases you are generating.
- Do NOT include any test case from "feature positive" or "feature negative" sections.
- Use documented data.
User Prompt (raw)
System Boundaries: {system_context_extraction}
Requirements Analysis: {test_requirements_analysis}
Final Test Plan: {final_test_plan}
12 Final Gherkin Test Cases
Create final comprehensive Gherkin test suite
pending
System Prompt (raw)
Combine the test cases into the final output format.

# Test Suite

## Feature Positive
```gherkin
Feature: [Feature name]

Scenario Outline OR Scenario: [Test title]
  Given [setup for new/changed functionality]
  When [operation testing the change]
  Then [expected outcome]
  Examples:
    | [param 1] | [param 2] | [param 3] |
    | [value 1] | [value 2] | [value 3] |
    | [value 4] | [value 5] | [value 6] |
```

## Feature Negative

## Regression Positive

## Regression Negative

Output Requirements:
- Present the result in Gherkin syntax, with a clear code block for each group
- The groups are: Feature Positive, Feature Negative, Regression Positive, Regression Negative
User Prompt (raw)
System Boundaries:
{system_context_extraction}

Test Plan:
{final_test_plan}

Feature Test Cases:
{feature_test_cases}

Regression Test Cases:
{regression_test_cases}
13 Xray CSV Export
Export test cases to Xray CSV format
pending
System Prompt (raw)
Convert the Gherkin test cases into Xray CSV format.

You are given a CSV template with the exact header row, delimiter and an example row:

{xray_csv_format}

Use exactly the same header row and delimiter as shown in this template. Treat the example row as a format example only; do not reuse its literal values.

For each test case scenario:
- Set 'Zusammenfassung' to a short summary of the scenario in German
- Leave 'Beschreibung', 'Autor', 'Bearbeiter' and 'Komponente' empty
- Set 'Typ' to 'Test'
- Set 'Test Typ' to 'Cucumber'
- Put the full Gherkin scenario (Feature, Scenario, Given/When/Then, Examples) into 'Test Beschreibung' as multi-line text
- Leave 'Story' empty, unless a clear story key is provided in the input

Rules:
- Quote fields that contain semicolons or newlines (especially 'Test Beschreibung')
- Each test scenario becomes one CSV row
- Include the header row exactly as in the template
- Keep content concise but complete
- Preserve German language from test cases
User Prompt (raw)
Gherkin Test Cases:
{final_result}

Convert these test cases to CSV format using the header row and delimiter from the provided template.
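The quoting rule above matches what a standard CSV writer does automatically once the delimiter is set to a semicolon. A sketch in Python (the column names come from the prompt; the scenario text is illustrative):

```python
import csv
import io

header = ["Zusammenfassung", "Beschreibung", "Autor", "Bearbeiter",
          "Komponente", "Typ", "Test Typ", "Test Beschreibung", "Story"]
scenario = ("Feature: Export\n"
            "Scenario: Erfolgreicher Export\n"
            "  Given ein gueltiger Testfall")

buf = io.StringIO()
writer = csv.writer(buf, delimiter=";")
writer.writerow(header)
# The multi-line 'Test Beschreibung' field is quoted automatically
# because it contains newlines.
writer.writerow(["Erfolgreicher Export", "", "", "", "", "Test",
                 "Cucumber", scenario, ""])
output = buf.getvalue()
```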