aura-labs.ai

Feature Readiness Standard

Every feature merged to main must satisfy the acceptance criteria defined in this standard. This applies equally to new features, enhancements, bug fixes, and refactors that touch user-facing endpoints or business-critical paths.

This is not a suggestion — it is a blocking requirement. Pull requests that do not meet these criteria must not pass the pipeline gate.

Why This Exists

AURA handles real money. A feature that works but can’t be monitored, secured, traced, or tested in production is a liability. The Access Worldpay platform taught us that operational maturity comes from discipline at the feature level, not from bolting quality onto a running system after the fact.

The Core Principle

Define what “done” looks like before writing code.

This principle applies to everything: tests, security, observability, API contracts, and documentation. These are not separate activities performed at different stages. They are all part of answering one question before the first line of implementation is written: “What must be true for this feature to be complete?”

Every feature begins with acceptance criteria. Those criteria are written into the PR description or linked issue before the first implementation commit. The criteria cover all of the domains below. If a domain doesn’t apply (e.g., the change doesn’t introduce a new endpoint), note that explicitly — don’t leave it ambiguous.

Acceptance Criteria Domains

1. Functional Correctness

What logic paths must be tested? What edge cases and error conditions? Which end-to-end flows must work, and which database state transitions must hold?

Before code:

2. Security

What are the security requirements for this feature? What attack surfaces does it introduce or modify?

Before code, answer these questions against the API Security Baseline (docs/security/API_SECURITY_BASELINE.md):

Write specific, testable security criteria. Examples of good criteria:

Examples of criteria that are too vague to be useful:

3. API Contract

What does the request/response interface look like?

Before code:

4. Observability

How will this feature be monitored in production?

Before code:

5. Documentation

What must be updated?

Verification (CI-Enforced)

Criteria are only as good as their verification. Every domain above must have corresponding tests that run in CI.

Functional tests

Unit and integration tests that verify the logic paths defined in domain 1.

Security tests

Tests that verify the security behaviour defined in domain 2. These are not optional supplements — they are acceptance tests, same as any functional test.

Required patterns:

Observability tests

Tests that verify the metric emission and trace propagation defined in domain 4. Use the MetricsTestHelper from src/lib/test-helpers/metrics-helper.js.

Required patterns:

Example:

import { test } from 'node:test';
import assert from 'node:assert/strict';

import { MetricsTestHelper } from '../lib/test-helpers/metrics-helper.js';
import { sessionsCreatedTotal } from '../lib/metrics.js';
// `app` is the application instance under test, provided by the suite's setup.

test('POST /sessions increments sessions_created counter', async () => {
  const helper = new MetricsTestHelper();
  await helper.snapshot(); // MUST await

  const response = await app.inject({
    method: 'POST',
    url: '/v1/sessions',
    payload: { intent: 'buy running shoes' },
  });

  assert.equal(response.statusCode, 200);
  await helper.assertCounterIncremented(sessionsCreatedTotal, 1);
});

PR Checklist

Copy this into your PR description:

## Feature Readiness Checklist

### Acceptance Criteria (defined before code)
- [ ] Functional test criteria defined (unit paths, integration flows, state transitions)
- [ ] Security criteria defined against API Security Baseline (auth, authorization, input validation, SSRF)
- [ ] API contract defined (request/response schemas, pagination, idempotency)
- [ ] Observability criteria defined (metrics, trace propagation, logging, SLO impact)
- [ ] Documentation needs identified

### Implementation Verification
- [ ] Unit tests pass (including authorization boundary tests)
- [ ] Integration tests pass
- [ ] Security acceptance tests verify authorization, authentication, input validation
- [ ] Observability tests verify metric emission and labels
- [ ] Performance tests pass (if latency-sensitive path)

### Documentation
- [ ] API docs updated
- [ ] Decision log entry (if architectural choice made)
- [ ] DEPLOYMENT.md updated (if new env vars or infra)
- [ ] Security annotations on route handlers (OWASP categories considered)

Enforcement

The CI pipeline gate blocks merge if any test job fails. Security tests, observability tests, and functional tests all run as part of the standard test suite. There are no separate gates: security and observability are not optional add-ons; they are part of the definition of done.

If a PR adds a new endpoint without corresponding authorization boundary tests and metric emission tests, the review must flag it as incomplete.

Reference Documents