Add test performance optimization requirements
- Remove time estimates from test stages
- Add Test Performance Optimization section with:
  - Requirement to time tests and investigate outliers
  - Common issue: tests awaiting timeouts instead of conditions
  - Code example showing bad vs good async waiting patterns
  - Guidance on fixture caching and parallelization
SKILL.md (21)
```diff
@@ -72,7 +72,6 @@ project/
 - **Location**: `services/<name>/tests/unit/`
 - **Dependencies**: All mocked
 - **Containers**: None
-- **Speed**: Seconds
 
 ### Stage 2: Service Tests
 
```
```diff
@@ -80,7 +79,6 @@ project/
 - **Config**: `services/<name>/deploy/test/docker-compose.yml`
 - **Dependencies**: Real database, mocked external APIs
 - **Containers**: Service + its direct dependencies only
-- **Speed**: 30-60 seconds per service
 
 ### Stage 3: Integration Tests
 
```
````diff
@@ -88,7 +86,24 @@ project/
 - **Config**: `deploy/test/docker-compose.yml`
 - **Dependencies**: All services running
 - **Containers**: Full stack
-- **Speed**: 1-3 minutes
+
+### Test Performance Optimization
+
+**Requirement**: Minimize total test execution time. Slow tests waste developer time and CI resources.
+
+- **Time all tests**: Use pytest's `--durations=10` to identify the slowest tests
+- **Investigate outliers**: Tests taking >1s in unit tests or >5s in integration tests need review
+- **Common culprit**: Tests waiting for timeouts instead of using condition-based assertions
+
+```python
+# BAD: Waits the full 5 seconds even if ready immediately
+await asyncio.sleep(5)
+assert result.is_ready()
+
+# GOOD: Returns as soon as the condition is met
+await wait_for(lambda: result.is_ready(), timeout=5)
+```
+
+- **Cache expensive setup**: Use `pytest` fixtures with appropriate scope (`module`, `session`)
+- **Parallelize**: Use `pytest-xdist` for CPU-bound test suites
 
 ## Test Environment Isolation
 
````
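Note that the `wait_for` in the added example is not `asyncio.wait_for` (which wraps an awaitable, not a synchronous predicate); the diff appears to assume a project-local polling helper. A minimal sketch of such a helper, with the name and parameters assumed rather than taken from the repository:

```python
import asyncio
import time


async def wait_for(condition, timeout: float = 5.0, interval: float = 0.01):
    """Poll a zero-argument `condition` callable until it returns truthy.

    Returns as soon as the condition holds; raises TimeoutError if it
    is not met within `timeout` seconds. `interval` is the polling delay.
    """
    deadline = time.monotonic() + timeout
    while not condition():
        if time.monotonic() >= deadline:
            raise TimeoutError("condition not met within timeout")
        await asyncio.sleep(interval)


# A condition that is already true completes almost immediately,
# instead of burning the full 5-second timeout.
asyncio.run(wait_for(lambda: True, timeout=5))
```

This is the property the commit is after: the happy path costs roughly one poll interval, while the timeout only bounds the unhappy path.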
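For the fixture-caching bullet, a sketch of what "appropriate scope" means in practice; the resource here is a hypothetical stand-in for an expensive setup such as a database pool or container:

```python
import pytest

SETUP_CALLS = 0  # counts how many times the expensive setup actually runs


@pytest.fixture(scope="session")
def expensive_resource():
    # With scope="session", this body runs once per test run and the
    # result is shared by every test that requests the fixture; the
    # default scope="function" would rebuild it for every single test.
    global SETUP_CALLS
    SETUP_CALLS += 1
    resource = {"connections": 10}  # hypothetical stand-in for a real pool
    yield resource
    resource.clear()  # teardown runs once, at session end
```

One caveat when combining this with the `pytest-xdist` bullet: each xdist worker is a separate process, so a session-scoped fixture is built once per worker, not once overall.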