Software Studio
Standard software development lifecycle
Stage Pipeline
Stage Details
Understand the problem, define success, and elaborate the intent into units
Hats
Understand the problem space, map the existing codebase, define scope and constraints, and identify technical risks and architectural implications. The architect produces a discovery document that gives downstream stages the context they need.
Break the intent into units with clear boundaries, define the dependency DAG, and write verifiable completion criteria for each unit. Each unit should be completable within a single bolt.
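The breakdown above can be sketched as data. A minimal Python sketch (unit names and criteria are hypothetical) that models units with verifiable completion criteria and orders them by their dependency DAG using the standard library's `graphlib`:

```python
from dataclasses import dataclass, field
from graphlib import TopologicalSorter

@dataclass
class Unit:
    uid: str
    completion_criteria: list[str]            # verifiable "done" checks
    depends_on: set[str] = field(default_factory=set)

def execution_order(units: list[Unit]) -> list[str]:
    """Order units so every dependency is built before its dependents."""
    graph = {u.uid: u.depends_on for u in units}
    return list(TopologicalSorter(graph).static_order())

units = [
    Unit("schema", ["migrations apply cleanly to an empty database"]),
    Unit("api", ["endpoints return responses matching the contract"], {"schema"}),
    Unit("ui", ["list view renders data fetched from the API"], {"api"}),
]
print(execution_order(units))  # ['schema', 'api', 'ui']
```

`TopologicalSorter` raises `CycleError` on a circular dependency, which doubles as a cheap validity check on the elaboration itself.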
Review Agents
The agent **MUST** verify the discovery document fully maps the problem space and that unit elaboration covers the intent with no gaps or overlaps.
The agent **MUST** challenge whether the elaboration is technically achievable given the codebase, dependencies, and constraints discovered.
Visual and interaction design for user-facing surfaces
Hats
Check consistency with the design system, verify all interaction states are covered, confirm responsive behavior at all breakpoints, and validate accessibility requirements.
Produce high-fidelity design artifacts from approved wireframes. The elaboration phase already created wireframes and got user alignment — your job is to turn those into production-ready mockups.
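The state and breakpoint checks above are mechanical enough to automate. A minimal sketch (component names and required sets are hypothetical) that flags mockups missing any required interaction state or breakpoint:

```python
REQUIRED_STATES = {"default", "hover", "focus", "disabled"}
REQUIRED_BREAKPOINTS = {"mobile", "tablet", "desktop"}

# Hypothetical design-artifact metadata: what each mockup actually covers.
mockups = {
    "button": {"states": {"default", "hover", "focus", "disabled"},
               "breakpoints": {"mobile", "tablet", "desktop"}},
    "nav":    {"states": {"default", "hover"},
               "breakpoints": {"mobile", "desktop"}},
}

def coverage_gaps(mockups: dict) -> dict[str, list[str]]:
    """Return, per component, every required state or breakpoint it lacks."""
    gaps = {}
    for name, m in mockups.items():
        missing = (REQUIRED_STATES - m["states"]) | (REQUIRED_BREAKPOINTS - m["breakpoints"])
        if missing:
            gaps[name] = sorted(missing)
    return gaps

print(coverage_gaps(mockups))  # {'nav': ['disabled', 'focus', 'tablet']}
```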
Review Agents
The agent **MUST** verify the design meets accessibility requirements and does not exclude users.
The agent **MUST** verify the design is internally consistent and aligns with the project's existing design system.
Define behavioral specifications and acceptance criteria
Hats
Define user stories, prioritize features, make scope decisions, and specify acceptance criteria from the user's perspective. Think in terms of what users do and see, not how the system implements it.
Write behavioral specs (given/when/then), define data contracts (API schemas, database models), and specify API contracts (endpoints, methods, request/response shapes). Precision matters — ambiguity in specs becomes bugs in code.
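Because ambiguity in specs becomes bugs in code, it can help to encode scenarios as structured data and lint them for hedge words. A minimal sketch (the endpoint, emails, and vague-word list are hypothetical):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Scenario:
    given: str
    when: str
    then: str

# Hypothetical spec fragment for a sign-up endpoint.
scenarios = [
    Scenario(given="no account exists for alice@example.com",
             when="POST /signup with that email and a valid password",
             then="201 Created with a user id in the response body"),
    Scenario(given="an account already exists for alice@example.com",
             when="POST /signup with that email",
             then="409 Conflict with error code 'email_taken'"),
]

def ambiguous(s: Scenario) -> bool:
    """Flag specs that hedge instead of committing to observable behavior."""
    vague = ("should", "probably", "appropriate", "as needed")
    text = f"{s.given} {s.when} {s.then}".lower()
    return any(word in text for word in vague)

assert not any(ambiguous(s) for s in scenarios)
```

The same frozen dataclasses can later seed given/when/then test cases, keeping spec and tests from drifting apart.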
Review Agents
The agent **MUST** verify behavioral specifications and data contracts fully cover the intent with no ambiguous or missing scenarios.
The agent **MUST** challenge whether the specified behavior is implementable within the technical constraints.
Implement the specification through code
Hats
Implement code to satisfy completion criteria, working in small verifiable increments. Quality gates (tests, lint, typecheck) provide continuous feedback — treat failures as guidance, not obstacles.
Read the unit spec and prior stage outputs, plan the implementation approach, identify files to modify, assess risks, and search for relevant learnings. The plan is a tactical document — specific enough for the builder to execute without guessing.
Verify implementation satisfies completion criteria through multi-stage review. Stage 1: spec compliance (does it do what the criteria say?). Stage 2: code quality (is it well-written?). Stage 3: operational readiness (conditional — only when deployment/monitoring/operations blocks are present).
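The "failures as guidance" loop can be made concrete: run every gate, collect every failure, and hand the full list back to the builder instead of stopping at the first one. A minimal sketch with in-process stand-ins (the gate names and failure message are hypothetical) for real typecheck/lint/test commands:

```python
def run_gates(gates):
    """Run every quality gate and collect all failures, so one broken
    check doesn't hide feedback from the others."""
    failures = []
    for name, check in gates:
        try:
            check()
        except AssertionError as exc:
            failures.append((name, str(exc)))  # guidance for the next increment
    return failures

def failing_lint():
    raise AssertionError("unused import in handlers.py")

# Hypothetical gates standing in for real typecheck / lint / test runs.
gates = [
    ("typecheck", lambda: None),
    ("lint", failing_lint),
    ("tests", lambda: None),
]
print(run_gates(gates))  # [('lint', 'unused import in handlers.py')]
```

In a real pipeline each `check` would shell out to the project's actual tools; the structure (run all, report all) is the point.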
Review Agents
The agent **MUST** verify the implementation follows the project's architectural patterns and does not introduce structural debt.
The agent **MUST** verify the implementation correctly satisfies the behavioral specification and completion criteria.
The agent **MUST** identify performance regressions or inefficiencies in the implementation.
The agent **MUST** identify security vulnerabilities introduced by the implementation.
The agent **MUST** verify tests actually validate behavior, not just exercise code paths.
from Design stage
from Design stage
from Product stage
Deployment, monitoring, and operational readiness
Hats
Configure deployment pipeline, define infrastructure as code, set up CI/CD, and ensure deployment is repeatable and rollback-safe. Every deployment should be automated, auditable, and reversible.
Define SLOs (availability, latency, error rate), set up monitoring and alerting, and write runbooks for common failure modes. The goal is that when something breaks at 3 AM, the on-call engineer has a step-by-step guide.
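An availability SLO implies an error budget, and alerting is usually framed around how fast that budget is being consumed. A minimal sketch of the arithmetic (the SLO target and request counts are illustrative):

```python
def error_budget_remaining(slo: float, total_requests: int, errors: int) -> float:
    """Fraction of the window's error budget still unspent.

    1.0 means untouched; negative means the SLO is already blown.
    """
    allowed_failures = (1 - slo) * total_requests  # failures the SLO permits
    if allowed_failures == 0:
        return 0.0
    return 1 - errors / allowed_failures

# A 99.9% availability SLO over 1,000,000 requests permits 1,000 failures;
# 250 observed errors leaves three quarters of the budget.
remaining = error_budget_remaining(0.999, 1_000_000, errors=250)
print(f"{remaining:.2%}")  # 75.00%
```

A burn-rate alert fires when the budget is being spent faster than the window allows, which is less noisy than alerting on raw error counts.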
Review Agents
The agent **MUST** verify the system is observable enough to diagnose issues in production.
The agent **MUST** verify the deployment and operational configuration supports reliable production operation.
from Development stage
Threat modeling, security review, and vulnerability assessment
Hats
Defense verification — implement security controls for identified threats, add security tests that prove the controls work, and validate monitoring coverage for security events. Fix root causes, not symptoms.
Attack surface analysis, injection testing (SQL, XSS, command), auth bypass attempts, privilege escalation testing, and data exposure checks. Think like an attacker — find what automated scanners miss.
Verify all identified threats have documented mitigations, check OWASP Top 10 coverage, validate security test coverage, and ensure no critical or high findings remain unaddressed. The final gate before security sign-off.
STRIDE threat modeling for all data flows and trust boundaries. Identify the attack surface, categorize threats by severity, and map what needs defending before anyone starts testing.
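A threat model like the one above is easy to keep machine-checkable, which makes the sign-off gate ("no threat without a documented mitigation") enforceable. A minimal sketch (the trust boundaries and mitigations are hypothetical examples):

```python
from enum import Enum

class Stride(Enum):
    SPOOFING = "S"
    TAMPERING = "T"
    REPUDIATION = "R"
    INFO_DISCLOSURE = "I"
    DENIAL_OF_SERVICE = "D"
    ELEVATION_OF_PRIVILEGE = "E"

# Hypothetical entries: (trust boundary, STRIDE category, mitigation or None).
threats = [
    ("browser -> api", Stride.SPOOFING, "session tokens validated server-side"),
    ("api -> db", Stride.TAMPERING, "parameterized queries only"),
    ("api -> db", Stride.INFO_DISCLOSURE, None),  # no mitigation yet: blocks sign-off
]

def unmitigated(threats) -> list[tuple[str, str]]:
    """Threats lacking a documented mitigation block security sign-off."""
    return [(boundary, cat.name) for boundary, cat, mit in threats if mit is None]

print(unmitigated(threats))  # [('api -> db', 'INFO_DISCLOSURE')]
```

The same table can feed the final gate: sign-off is simply `unmitigated(threats) == []` plus the manual severity review.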
Review Agents
The agent **MUST** challenge whether proposed mitigations actually address the threats they claim to.
The agent **MUST** verify the threat model is comprehensive and all identified threats have mitigations.
from Development stage
from Development stage
from Operations stage
Full software development lifecycle from inception through security review. Supports both single-stage (all disciplines merged) and multi-stage (sequential discipline progression) execution modes.