Mirror of https://github.com/Xe138/AI-Trader.git, synced 2026-04-01 17:17:24 -04:00
Major architecture transformation from batch-only to API service with
database persistence for Windmill integration.
## REST API Implementation
- POST /simulate/trigger - Start simulation jobs
- GET /simulate/status/{job_id} - Monitor job progress
- GET /results - Query results with filters (job_id, date, model)
- GET /health - Service health checks
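The job-lifecycle contract behind these endpoints can be sketched as plain functions over an in-memory store. This is illustrative only — function and field names are assumptions, and the real service is a FastAPI app backed by the SQLite layer described below:

```python
import uuid

# Illustrative in-memory job store; the real service persists jobs in SQLite.
JOBS = {}

def trigger_simulation(config: dict) -> dict:
    """POST /simulate/trigger - register a job and return its id."""
    job_id = str(uuid.uuid4())
    JOBS[job_id] = {"status": "queued", "config": config}
    return {"job_id": job_id, "status": "queued"}

def simulation_status(job_id: str) -> dict:
    """GET /simulate/status/{job_id} - report progress or a not-found error."""
    job = JOBS.get(job_id)
    if job is None:
        return {"error": "job not found"}
    return {"job_id": job_id, "status": job["status"]}

def health() -> dict:
    """GET /health - liveness payload."""
    return {"status": "ok"}
```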
## Database Layer
- SQLite persistence with 6 tables (jobs, job_details, positions,
holdings, reasoning_logs, tool_usage)
- Foreign key constraints with cascade deletes
- Replaces JSONL file storage
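The cascade-delete behavior can be demonstrated with two of the six tables. Column names beyond the ids are assumptions; note that SQLite only enforces cascades when `foreign_keys` is switched on per connection:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # required for ON DELETE CASCADE in SQLite
conn.execute("""CREATE TABLE jobs (
    job_id TEXT PRIMARY KEY,
    status TEXT NOT NULL)""")
conn.execute("""CREATE TABLE positions (
    id INTEGER PRIMARY KEY,
    job_id TEXT NOT NULL REFERENCES jobs(job_id) ON DELETE CASCADE,
    symbol TEXT NOT NULL)""")

conn.execute("INSERT INTO jobs VALUES ('job-1', 'completed')")
conn.execute("INSERT INTO positions (job_id, symbol) VALUES ('job-1', 'AAPL')")

# Deleting the parent job removes its positions automatically.
conn.execute("DELETE FROM jobs WHERE job_id = 'job-1'")
remaining = conn.execute("SELECT COUNT(*) FROM positions").fetchone()[0]
```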
## Backend Components
- JobManager: Job lifecycle management with concurrency control
- RuntimeConfigManager: Thread-safe isolated runtime configs
- ModelDayExecutor: Single model-day execution engine
- SimulationWorker: Date-sequential, model-parallel orchestration
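The SimulationWorker ordering — sequential over dates, parallel over models within each date — can be sketched like this (names and the thread-pool choice are illustrative, not the project's actual implementation):

```python
from concurrent.futures import ThreadPoolExecutor

def run_model_day(model: str, date: str) -> tuple:
    # Stand-in for ModelDayExecutor: execute one model for one trading day.
    return (date, model)

def run_simulation(models, dates):
    completed = []
    for date in dates:  # dates run strictly in sequence
        with ThreadPoolExecutor(max_workers=len(models)) as pool:
            # all models for this date run in parallel
            day = list(pool.map(lambda m: run_model_day(m, date), models))
        completed.extend(day)  # a date finishes before the next one starts
    return completed

results = run_simulation(["gpt-4", "claude"], ["2026-01-02", "2026-01-03"])
```

Keeping dates sequential lets each day's positions feed the next day, while the per-date pool keeps independent models from blocking one another.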
## Testing
- 102 unit and integration tests (85% coverage)
- Database: 98% coverage
- Job manager: 98% coverage
- API endpoints: 81% coverage
- Pydantic models: 100% coverage
- TDD approach throughout
## Docker Deployment
- Dual-mode: API server (persistent) + batch (one-time)
- Health checks with 30s interval
- Volume persistence for database and logs
- Separate entrypoints for each mode
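The dual-mode dispatch implied by the separate entrypoints can be sketched as below; the `RUN_MODE` variable name is an assumption for illustration, not the project's actual flag:

```python
def select_entrypoint(env: dict) -> str:
    # RUN_MODE is a hypothetical switch: "api" starts the persistent
    # server, anything else falls back to one-time batch mode.
    mode = env.get("RUN_MODE", "batch")
    return "entrypoint-api.sh" if mode == "api" else "entrypoint.sh"
```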
## Validation Tools
- scripts/validate_docker_build.sh - Build validation
- scripts/test_api_endpoints.sh - Complete API testing
- scripts/test_batch_mode.sh - Batch mode validation
- DOCKER_API.md - Deployment guide
- TESTING_GUIDE.md - Testing procedures
## Configuration
- API_PORT environment variable (default: 8080)
- Backwards compatible with existing configs
- FastAPI, uvicorn, pydantic>=2.0 dependencies
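Reading the `API_PORT` setting with its documented default might look like this (the helper name is illustrative):

```python
import os

def api_port(env=None) -> int:
    # Fall back to the documented default of 8080 when API_PORT is unset.
    env = os.environ if env is None else env
    return int(env.get("API_PORT", "8080"))
```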
Co-Authored-By: AI Assistant <noreply@example.com>
39 lines
925 B
Docker
# Base stage - dependency installation
FROM python:3.10-slim AS base

WORKDIR /app

# Install dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Application stage
FROM base

WORKDIR /app

# Copy application code
COPY . .

# Copy data scripts to separate directory (volume mount won't overlay these)
RUN mkdir -p /app/scripts && \
    cp data/get_daily_price.py /app/scripts/ && \
    cp data/get_interdaily_price.py /app/scripts/ && \
    cp data/merge_jsonl.py /app/scripts/

# Create necessary directories
RUN mkdir -p data logs data/agent_data

# Make entrypoints executable
RUN chmod +x entrypoint.sh entrypoint-api.sh

# Expose MCP service ports, API server, and web dashboard
EXPOSE 8000 8001 8002 8003 8080 8888

# Set Python to run unbuffered for real-time logs
ENV PYTHONUNBUFFERED=1

# Use entrypoint script
ENTRYPOINT ["./entrypoint.sh"]
CMD ["configs/default_config.json"]