Commit Graph

9 Commits

1f41e9d7ca feat: add skip status tracking for job orchestration
Implement skip status tracking to fix jobs hanging when dates are
filtered out. Jobs now properly complete when all model-days reach
terminal states (completed/failed/skipped).

Changes:
- database.py: Add 'skipped' status to job_details CHECK constraint
- job_manager.py: Update completion logic to count skipped as done
- job_manager.py: Add skipped count to progress tracking
- simulation_worker.py: Implement skip tracking with per-model granularity
- simulation_worker.py: Add _filter_completed_dates_with_tracking()
- simulation_worker.py: Add _mark_skipped_dates()
- simulation_worker.py: Update _prepare_data() to use skip tracking
- simulation_worker.py: Improve warning messages to distinguish skip types

Skip reasons:
- "Already completed" - Position data exists from previous job
- "Incomplete price data" - Missing prices (weekends/holidays/future)

The implementation correctly handles multi-model scenarios where different
models have different completion states for the same date.
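
A minimal sketch of the completion rule this commit describes, with hypothetical helper names (the actual job_manager.py logic is not shown in this log):

```python
# Sketch of the "all model-days terminal" completion rule (hypothetical names).
TERMINAL_STATES = {"completed", "failed", "skipped"}

def job_is_complete(detail_statuses: list[str]) -> bool:
    """A job finishes once every model-day is in a terminal state."""
    return all(status in TERMINAL_STATES for status in detail_statuses)

def progress_counts(detail_statuses: list[str]) -> dict[str, int]:
    """Progress now reports skipped alongside completed and failed."""
    return {state: detail_statuses.count(state) for state in TERMINAL_STATES}

# Example: a job whose only remaining date was filtered out no longer hangs.
statuses = ["completed", "skipped", "failed"]
assert job_is_complete(statuses)
assert progress_counts(statuses) == {"completed": 1, "failed": 1, "skipped": 1}
```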
2025-11-02 09:35:58 -05:00
3535746eb7 fix: simplify database migration for pre-production
Remove complex table recreation logic since the server hasn't been
deployed yet. For existing databases, simply delete and recreate them.

The dev database is already recreated on startup by design.
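
For illustration only, a startup delete-and-recreate could look like the sketch below; the path and schema are assumptions, not the project's actual code:

```python
import os
import sqlite3

DEV_DB_PATH = "dev.sqlite3"  # hypothetical location

def recreate_dev_db(path: str = DEV_DB_PATH) -> sqlite3.Connection:
    """Drop any existing dev database and rebuild the schema from scratch."""
    if os.path.exists(path):
        os.remove(path)  # pre-production: nothing worth migrating
    conn = sqlite3.connect(path)
    conn.execute("CREATE TABLE jobs (id TEXT PRIMARY KEY, status TEXT)")  # schema sketch
    return conn
```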

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-02 07:23:58 -05:00
baa44c208a fix: add migration logic for warnings column and update tests
Critical fixes identified in code review:

1. Add warnings column migration to _migrate_schema()
   - Checks if warnings column exists in jobs table
   - Adds column via ALTER TABLE if missing
   - Ensures existing databases get new column on upgrade

2. Document CHECK constraint limitation
   - Adds a docstring explaining that ALTER TABLE cannot add CHECK constraints
   - Notes that the "downloading_data" status requires a fresh DB or manual migration

3. Add comprehensive migration tests (sketched below)
   - test_migration_adds_warnings_column: Verifies warnings column migration
   - test_migration_adds_simulation_run_id_column: Tests existing migration
   - Both tests include cleanup to prevent cross-test contamination

4. Update test fixtures and expectations
   - Updated clean_db fixture to delete from all 9 tables
   - Fixed table count assertions (6 -> 9 tables)
   - Updated expected columns in schema tests

All 21 database tests now pass.
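
A hedged sketch of what a test like test_migration_adds_warnings_column might check; the pre-upgrade schema and the inline migration stand-in are invented, since the real fixtures and _migrate_schema() are not shown here:

```python
import sqlite3

def _column_names(conn: sqlite3.Connection, table: str) -> set[str]:
    return {row[1] for row in conn.execute(f"PRAGMA table_info({table})")}

def test_migration_adds_warnings_column(tmp_path):  # tmp_path is a pytest fixture
    """Old-schema databases gain the warnings column after migration."""
    conn = sqlite3.connect(tmp_path / "old.sqlite3")
    conn.execute("CREATE TABLE jobs (id TEXT PRIMARY KEY, status TEXT)")  # pre-upgrade schema
    assert "warnings" not in _column_names(conn, "jobs")

    # Stand-in for the real _migrate_schema(): add the column if missing.
    if "warnings" not in _column_names(conn, "jobs"):
        conn.execute("ALTER TABLE jobs ADD COLUMN warnings TEXT")

    assert "warnings" in _column_names(conn, "jobs")
    conn.close()
```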
2025-11-01 23:17:25 -04:00
711ae5df73 feat(db): add downloading_data status and warnings column
Add support for:
- downloading_data job status for visibility during data prep
- warnings TEXT column for storing job-level warnings (JSON array)
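
As a sketch of how these two additions could appear in the schema (column list abbreviated; every status value other than 'downloading_data' is a guess at the existing set):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE jobs (
        id       TEXT PRIMARY KEY,
        status   TEXT NOT NULL CHECK (status IN
                 ('pending', 'downloading_data', 'running',
                  'completed', 'failed')),
        warnings TEXT  -- JSON array of job-level warnings
    )
""")
```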

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-01 23:10:01 -04:00
837962ceea feat: integrate deployment mode path resolution in database module 2025-11-01 11:22:03 -04:00
8fb2ead8ff feat: add dev database initialization and cleanup functions 2025-11-01 11:20:15 -04:00
2575e0c12a fix: add database schema migration for simulation_run_id column
Add automatic schema migration to handle existing databases that don't
have the simulation_run_id column in the positions table.

Problem:
- v0.3.0-alpha.3 databases lack simulation_run_id column
- CREATE TABLE IF NOT EXISTS doesn't add new columns to existing tables
- Index creation fails with "no such column: simulation_run_id"

Solution:
- Add _migrate_schema() function to detect and migrate old schemas
- Check if positions table exists and inspect its columns
- ALTER TABLE to add simulation_run_id if missing
- Run migration before creating indexes
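
A minimal sketch of that detect-and-alter pattern; the column type and index name are assumptions, and the real _migrate_schema() presumably handles more cases:

```python
import sqlite3

def _migrate_schema(conn: sqlite3.Connection) -> None:
    """Add simulation_run_id to old positions tables before indexes are built."""
    tables = {row[0] for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")}
    if "positions" not in tables:
        return  # fresh database: CREATE TABLE already includes the column
    columns = {row[1] for row in conn.execute("PRAGMA table_info(positions)")}
    if "simulation_run_id" not in columns:
        conn.execute("ALTER TABLE positions ADD COLUMN simulation_run_id INTEGER")

# The migration has to run before the index that references the new column:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE positions (id INTEGER PRIMARY KEY)")  # alpha.3-style schema
_migrate_schema(conn)
conn.execute(
    "CREATE INDEX IF NOT EXISTS idx_positions_run_id ON positions (simulation_run_id)")
```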

This allows seamless upgrades from alpha.3 to alpha.4 without manual
database deletion or migration scripts.

Fixes docker compose startup error:
  sqlite3.OperationalError: no such column: simulation_run_id

Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-31 18:41:38 -04:00
bddf4d8b72 feat: add price data management infrastructure (WIP)
Phase 1 of v0.3.0 date range and on-demand download implementation.

Database changes:
- Add price_data table (OHLCV data, replaces merged.jsonl)
- Add price_data_coverage table (track downloaded date ranges)
- Add simulation_runs table (soft delete support)
- Add simulation_run_id to positions table
- Add comprehensive indexes for new tables
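
A rough sketch of how the two price tables might be shaped; any column beyond those named above is a guess:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# OHLCV storage replacing merged.jsonl (column set is illustrative).
conn.execute("""
    CREATE TABLE price_data (
        symbol TEXT NOT NULL,
        date   TEXT NOT NULL,
        open REAL, high REAL, low REAL, close REAL, volume INTEGER,
        PRIMARY KEY (symbol, date)
    )
""")
# Tracks which date ranges have already been downloaded per symbol.
conn.execute("""
    CREATE TABLE price_data_coverage (
        symbol     TEXT NOT NULL,
        start_date TEXT NOT NULL,
        end_date   TEXT NOT NULL
    )
""")
conn.execute("CREATE INDEX idx_coverage_symbol ON price_data_coverage (symbol)")
```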

New modules:
- api/price_data_manager.py - Priority-based download manager
  - Coverage gap detection
  - Smart download prioritization (maximize date completion)
  - Rate limit handling with retry logic
  - Alpha Vantage integration
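
Coverage gap detection could work roughly as in this sketch (inclusive date ranges; the real price_data_manager.py may differ):

```python
from datetime import date, timedelta

def find_gaps(requested: tuple[date, date],
              covered: list[tuple[date, date]]) -> list[tuple[date, date]]:
    """Return sub-ranges of `requested` not covered by any downloaded range."""
    start, end = requested
    gaps: list[tuple[date, date]] = []
    cursor = start
    for c_start, c_end in sorted(covered):
        if c_start > cursor:
            gaps.append((cursor, min(c_start - timedelta(days=1), end)))
        cursor = max(cursor, c_end + timedelta(days=1))
        if cursor > end:
            break
    if cursor <= end:
        gaps.append((cursor, end))
    return gaps

# Example: one downloaded range in the middle leaves two gaps to fetch.
assert find_gaps((date(2025, 1, 1), date(2025, 1, 31)),
                 [(date(2025, 1, 10), date(2025, 1, 20))]) == [
    (date(2025, 1, 1), date(2025, 1, 9)),
    (date(2025, 1, 21), date(2025, 1, 31)),
]
```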

Configuration:
- configs/nasdaq100_symbols.json - NASDAQ 100 constituent list

Next steps (not yet implemented):
- Migration script for merged.jsonl -> price_data
- Update API models (start_date/end_date)
- Update tools/price_tools.py to read from database
- Simulation run tracking implementation
- API integration
- Tests and documentation

This is work in progress for the v0.3.0 release.
2025-10-31 16:37:14 -04:00
fb9583b374 feat: transform to REST API service with SQLite persistence (v0.3.0)
Major architecture transformation from batch-only to API service with
database persistence for Windmill integration.

## REST API Implementation
- POST /simulate/trigger - Start simulation jobs
- GET /simulate/status/{job_id} - Monitor job progress
- GET /results - Query results with filters (job_id, date, model)
- GET /health - Service health checks
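
A compressed sketch of that endpoint surface; the response shapes are invented for illustration, since the real Pydantic models are not shown in this log:

```python
from fastapi import FastAPI

app = FastAPI()

@app.post("/simulate/trigger")
def trigger_simulation(payload: dict) -> dict:
    # The real service validates payload with Pydantic and enqueues via JobManager.
    return {"job_id": "..."}

@app.get("/simulate/status/{job_id}")
def job_status(job_id: str) -> dict:
    return {"job_id": job_id, "status": "running"}

@app.get("/results")
def results(job_id: str | None = None, date: str | None = None,
            model: str | None = None) -> dict:
    return {"results": [], "filters": {"job_id": job_id, "date": date, "model": model}}

@app.get("/health")
def health() -> dict:
    return {"status": "ok"}
```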

## Database Layer
- SQLite persistence with 6 tables (jobs, job_details, positions,
  holdings, reasoning_logs, tool_usage)
- Foreign key constraints with cascade deletes
- Replaces JSONL file storage
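
One detail worth noting from the foreign-key bullet: SQLite only enforces cascade deletes when foreign keys are switched on per connection. A minimal sketch, with invented table shapes:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
conn.execute("CREATE TABLE jobs (id TEXT PRIMARY KEY)")
conn.execute("""
    CREATE TABLE positions (
        id     INTEGER PRIMARY KEY,
        job_id TEXT NOT NULL REFERENCES jobs (id) ON DELETE CASCADE
    )
""")
conn.execute("INSERT INTO jobs VALUES ('job-1')")
conn.execute("INSERT INTO positions (job_id) VALUES ('job-1')")
conn.execute("DELETE FROM jobs WHERE id = 'job-1'")  # cascades to positions
assert conn.execute("SELECT COUNT(*) FROM positions").fetchone()[0] == 0
```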

## Backend Components
- JobManager: Job lifecycle management with concurrency control
- RuntimeConfigManager: Thread-safe isolated runtime configs
- ModelDayExecutor: Single model-day execution engine
- SimulationWorker: Date-sequential, model-parallel orchestration
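
The date-sequential, model-parallel split with a concurrency cap might be sketched like this (asyncio chosen for illustration; the actual JobManager/SimulationWorker code is not shown):

```python
import asyncio

async def run_job(dates: list[str], models: list[str], max_parallel: int = 4) -> None:
    """Dates run sequentially; within a date, models run in parallel."""
    limit = asyncio.Semaphore(max_parallel)  # JobManager-style concurrency cap

    async def run_model_day(model: str, day: str) -> None:
        async with limit:
            await asyncio.sleep(0)  # stand-in for ModelDayExecutor work

    for day in dates:  # date-sequential orchestration
        await asyncio.gather(*(run_model_day(m, day) for m in models))

asyncio.run(run_job(["2025-10-01", "2025-10-02"], ["model-a", "model-b"]))
```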

## Testing
- 102 unit and integration tests (85% coverage)
- Database: 98% coverage
- Job manager: 98% coverage
- API endpoints: 81% coverage
- Pydantic models: 100% coverage
- TDD approach throughout

## Docker Deployment
- Dual-mode: API server (persistent) + batch (one-time)
- Health checks with 30s interval
- Volume persistence for database and logs
- Separate entrypoints for each mode

## Validation Tools
- scripts/validate_docker_build.sh - Build validation
- scripts/test_api_endpoints.sh - Complete API testing
- scripts/test_batch_mode.sh - Batch mode validation
- DOCKER_API.md - Deployment guide
- TESTING_GUIDE.md - Testing procedures

## Configuration
- API_PORT environment variable (default: 8080)
- Backwards compatible with existing configs
- FastAPI, uvicorn, pydantic>=2.0 dependencies
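
Reading the port setting would presumably reduce to something like this one-liner:

```python
import os

port = int(os.environ.get("API_PORT", "8080"))  # default 8080, per the commit above
```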

Co-Authored-By: AI Assistant <noreply@example.com>
2025-10-31 11:47:10 -04:00