Design Tests
Auto review
Design test cases and plan automation
Dependencies
Hat Sequence
Automator
Focus: Assess automation feasibility and design the automation strategy for the test suite.
Responsibilities:
- Evaluate which test cases are candidates for automation based on ROI
- Define the automation framework and tooling requirements
- Design automation architecture for maintainability and reliability
- Establish automation standards and patterns for the team
Anti-patterns (RFC 2119):
- The agent MUST NOT automate everything without considering maintenance cost vs execution frequency
- The agent MUST NOT choose automation tools before understanding the test requirements
- The agent MUST NOT design automation that is tightly coupled to implementation details
- The agent MUST account for test data management and environment setup in automation
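The ROI evaluation named above can be made concrete. The following is a minimal sketch, not a prescribed formula: the function name, parameters, and the example figures are illustrative assumptions, comparing cumulative manual execution cost against the cost to build and maintain an automated test.

```python
# Hypothetical ROI sketch: automate only when cumulative manual cost
# exceeds the cost to build and maintain the automated test.
# All names and numbers here are illustrative, not from the spec.

def automation_roi(manual_minutes: float, runs_per_release: int,
                   releases: int, build_minutes: float,
                   maintain_minutes_per_release: float) -> float:
    """Return net minutes saved by automating; positive means automate."""
    manual_cost = manual_minutes * runs_per_release * releases
    automation_cost = build_minutes + maintain_minutes_per_release * releases
    return manual_cost - automation_cost

# A 10-minute manual test run 5 times per release over 12 releases,
# vs. 120 minutes to build and 10 minutes per release to maintain:
savings = automation_roi(10, 5, 12, 120, 10)
print(savings)  # 360.0: automation pays off
```

A rarely-run exploratory test would come out negative under the same model, which is exactly the "MUST NOT automate everything" case.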
Designer
Focus: Design test cases that provide comprehensive coverage of requirements while enabling efficient execution.
Responsibilities:
- Create test cases with clear preconditions, steps, expected results, and pass/fail criteria
- Build traceability matrix linking test cases to requirements
- Design test data sets that cover boundary conditions and edge cases
- Optimize test suite to minimize redundancy while maximizing coverage
Anti-patterns (RFC 2119):
- The agent MUST NOT write test cases without explicit expected results
- The agent MUST NOT design tests that only cover the happy path
- The agent MUST maintain traceability between tests and requirements
- The agent MUST NOT create unnecessarily verbose test cases that slow down execution
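One way to enforce the required test-case fields and requirement traceability is to give test cases an explicit schema. This is a sketch under assumed names (the `TestCase` fields and IDs are illustrative, not mandated by this spec):

```python
# Illustrative test-case schema: every case carries explicit
# preconditions, steps, expected results, pass criteria, and
# requirement links for the traceability matrix.
from dataclasses import dataclass, field

@dataclass
class TestCase:
    id: str
    requirement_ids: list[str]   # traceability back to requirements
    preconditions: list[str]
    steps: list[str]
    expected_results: list[str]
    pass_criteria: str

tc = TestCase(
    id="TC-042",
    requirement_ids=["REQ-007"],
    preconditions=["User account exists", "User is logged out"],
    steps=["Open login page", "Enter valid credentials", "Submit"],
    expected_results=["Dashboard loads", "Session cookie is set"],
    pass_criteria="All expected results observed within 5 seconds",
)
```

Making `expected_results` and `requirement_ids` required fields means a case without them fails at construction time rather than at review time.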
Review Agents
Traceability
Mandate: The agent MUST verify test cases are traceable to requirements and provide the coverage defined in the strategy.
Check:
- The agent MUST verify that every requirement has at least one associated test case
- The agent MUST verify that test cases include explicit preconditions, steps, and expected results
- The agent MUST verify that automation candidates are selected based on ROI analysis, not convenience
- The agent MUST verify that test data requirements cover boundary conditions and edge cases
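The first check above, that every requirement has at least one associated test case, is mechanically verifiable from the traceability matrix. A minimal sketch, assuming test cases carry a `requirement_ids` list as in the matrix:

```python
# Coverage gap check: return requirement IDs that no test case claims.
# Data shapes here are assumed for illustration.

def untested_requirements(requirements: list[str],
                          test_cases: list[dict]) -> list[str]:
    """Return requirement IDs with no associated test case."""
    covered = {req for tc in test_cases for req in tc["requirement_ids"]}
    return sorted(set(requirements) - covered)

reqs = ["REQ-001", "REQ-002", "REQ-003"]
tests = [
    {"id": "TC-1", "requirement_ids": ["REQ-001"]},
    {"id": "TC-2", "requirement_ids": ["REQ-003"]},
]
print(untested_requirements(reqs, tests))  # ['REQ-002']
```

An empty result is the pass condition; any returned IDs are coverage gaps the Designer must close before the completion signal fires.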
Design Tests
Criteria Guidance
Good criteria examples:
- "Test suite spec includes test cases for every requirement with traceability matrix linking tests to requirements"
- "Each test case has explicit preconditions, steps, expected results, and pass/fail criteria"
- "Automation feasibility assessment identifies which tests to automate, which to run manually, and the rationale"
Bad criteria examples:
- "Test cases are designed"
- "Automation is planned"
- "Tests are ready"
Completion Signal (RFC 2119)
Test suite spec MUST exist with test cases traceable to requirements, automation plan defined, and test data requirements documented. Designer MUST have confirmed coverage meets the strategy targets. Automator MUST have validated automation feasibility and identified framework and tooling requirements.