Mirror of https://github.com/Xe138/AI-Trader.git
Synced 2026-04-04 18:07:24 -04:00

docs: restructure documentation for improved clarity and navigation

Reorganize documentation into user-focused, developer-focused, and deployment-focused sections.

**New structure:**
- Root: README.md (streamlined), QUICK_START.md, API_REFERENCE.md
- docs/user-guide/: configuration, API usage, integrations, troubleshooting
- docs/developer/: contributing, development setup, testing, architecture
- docs/deployment/: Docker deployment, production checklist, monitoring
- docs/reference/: environment variables, MCP tools, data formats

**Changes:**
- Streamline README.md from 831 to 469 lines
- Create QUICK_START.md for 5-minute onboarding
- Create API_REFERENCE.md as single source of truth for the API
- Remove 9 outdated specification docs (v0.2.0 API design)
- Remove DOCKER_API.md (content consolidated into new structure)
- Remove docs/plans/ directory with old design documents
- Update CLAUDE.md with documentation structure guide
- Remove orchestration-specific references

**Benefits:**
- Clear entry points for different audiences
- No content duplication
- Better discoverability through logical hierarchy
- All content reflects current v0.3.0 API

docs/user-guide/configuration.md (new file, 327 lines)
@@ -0,0 +1,327 @@
# Configuration Guide

Complete guide to configuring AI-Trader.

---

## Environment Variables

Set in the `.env` file in the project root.

### Required Variables

```bash
# OpenAI API (or compatible endpoint)
OPENAI_API_KEY=sk-your-key-here

# Alpha Vantage (price data)
ALPHAADVANTAGE_API_KEY=your-key-here

# Jina AI (market intelligence search)
JINA_API_KEY=your-key-here
```

### Optional Variables

```bash
# API Server Configuration
API_PORT=8080            # Host port mapping (default: 8080)
API_HOST=0.0.0.0         # Bind address (default: 0.0.0.0)

# OpenAI Configuration
OPENAI_API_BASE=https://api.openai.com/v1  # Custom endpoint

# Simulation Limits
MAX_CONCURRENT_JOBS=1    # Max simultaneous jobs (default: 1)
MAX_SIMULATION_DAYS=30   # Max date range per job (default: 30)

# Price Data Management
AUTO_DOWNLOAD_PRICE_DATA=true  # Auto-fetch missing data (default: true)

# Agent Configuration
AGENT_MAX_STEP=30        # Max reasoning steps per day (default: 30)

# Volume Paths
VOLUME_PATH=.            # Base directory for data (default: .)

# MCP Service Ports (usually don't need to change)
MATH_HTTP_PORT=8000
SEARCH_HTTP_PORT=8001
TRADE_HTTP_PORT=8002
GETPRICE_HTTP_PORT=8003
```

---

## Model Configuration

Edit `configs/default_config.json` to define available AI models.

### Configuration Structure

```json
{
  "agent_type": "BaseAgent",
  "date_range": {
    "init_date": "2025-01-01",
    "end_date": "2025-01-31"
  },
  "models": [
    {
      "name": "GPT-4",
      "basemodel": "openai/gpt-4",
      "signature": "gpt-4",
      "enabled": true
    }
  ],
  "agent_config": {
    "max_steps": 30,
    "max_retries": 3,
    "initial_cash": 10000.0
  },
  "log_config": {
    "log_path": "./data/agent_data"
  }
}
```

### Model Configuration Fields

| Field | Required | Description |
|-------|----------|-------------|
| `name` | Yes | Display name for the model |
| `basemodel` | Yes | Model identifier (e.g., `openai/gpt-4`, `anthropic/claude-3.7-sonnet`) |
| `signature` | Yes | Unique identifier used in API requests and the database |
| `enabled` | Yes | Whether this model runs when no models are specified in the API request |
| `openai_base_url` | No | Custom API endpoint for this model |
| `openai_api_key` | No | Model-specific API key (overrides `OPENAI_API_KEY` env var) |
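A small script can sanity-check these fields before a run. This is an illustrative sketch, not part of AI-Trader itself; it only enforces the constraints described above (required keys present, signatures unique):

```python
import json

# Required keys per the Model Configuration Fields table above
REQUIRED_FIELDS = {"name", "basemodel", "signature", "enabled"}

def validate_models(config):
    """Return a list of problems found in the 'models' section."""
    problems = []
    models = config.get("models", [])
    for i, model in enumerate(models):
        missing = REQUIRED_FIELDS - model.keys()
        if missing:
            problems.append(f"model[{i}]: missing fields {sorted(missing)}")
    signatures = [m.get("signature") for m in models]
    if len(signatures) != len(set(signatures)):
        problems.append("duplicate model signatures")
    return problems

if __name__ == "__main__":
    with open("configs/default_config.json") as f:
        print(validate_models(json.load(f)) or "OK")
```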
### Adding Custom Models

**Example: Add Claude 3.7 Sonnet**

```json
{
  "models": [
    {
      "name": "Claude 3.7 Sonnet",
      "basemodel": "anthropic/claude-3.7-sonnet",
      "signature": "claude-3.7-sonnet",
      "enabled": true,
      "openai_base_url": "https://api.anthropic.com/v1",
      "openai_api_key": "your-anthropic-key"
    }
  ]
}
```

**Example: Add DeepSeek via OpenRouter**

```json
{
  "models": [
    {
      "name": "DeepSeek",
      "basemodel": "deepseek/deepseek-chat",
      "signature": "deepseek",
      "enabled": true,
      "openai_base_url": "https://openrouter.ai/api/v1",
      "openai_api_key": "your-openrouter-key"
    }
  ]
}
```

### Agent Configuration

| Field | Description | Default |
|-------|-------------|---------|
| `max_steps` | Maximum reasoning iterations per trading day | 30 |
| `max_retries` | Retry attempts on API failures | 3 |
| `initial_cash` | Starting capital per model | 10000.0 |

---

## Port Configuration

### Default Ports

| Service | Internal Port | Host Port (configurable) |
|---------|---------------|--------------------------|
| API Server | 8080 | `API_PORT` (default: 8080) |
| MCP Math | 8000 | Not exposed to host |
| MCP Search | 8001 | Not exposed to host |
| MCP Trade | 8002 | Not exposed to host |
| MCP Price | 8003 | Not exposed to host |

### Changing the API Port

If port 8080 is already in use:

```bash
# Add to .env
echo "API_PORT=8889" >> .env

# Restart
docker-compose down
docker-compose up -d

# Access on the new port
curl http://localhost:8889/health
```

---

## Volume Configuration

Docker volumes persist data across container restarts:

```yaml
volumes:
  - ./data:/app/data        # Database, price data, agent data
  - ./configs:/app/configs  # Configuration files
  - ./logs:/app/logs        # Application logs
```

### Data Directory Structure

```
data/
├── jobs.db                 # SQLite database
├── merged.jsonl            # Cached price data
├── daily_prices_*.json     # Individual stock data
├── price_coverage.json     # Data availability tracking
└── agent_data/             # Agent execution data
    └── {signature}/
        ├── position/
        │   └── position.jsonl  # Trading positions
        └── log/
            └── {date}/
                └── log.jsonl   # Trading logs
```

---

## API Key Setup

### OpenAI API Key

1. Visit [platform.openai.com/api-keys](https://platform.openai.com/api-keys)
2. Create a new key
3. Add to `.env`:
   ```bash
   OPENAI_API_KEY=sk-...
   ```

### Alpha Vantage API Key

1. Visit [alphavantage.co/support/#api-key](https://www.alphavantage.co/support/#api-key)
2. Get a free key (5 req/min) or premium (75 req/min)
3. Add to `.env`:
   ```bash
   ALPHAADVANTAGE_API_KEY=...
   ```

### Jina AI API Key

1. Visit [jina.ai](https://jina.ai/)
2. Sign up for the free tier
3. Add to `.env`:
   ```bash
   JINA_API_KEY=...
   ```

---

## Configuration Examples

### Development Setup

```bash
# .env
API_PORT=8080
MAX_CONCURRENT_JOBS=1
MAX_SIMULATION_DAYS=5   # Limit for faster testing
AUTO_DOWNLOAD_PRICE_DATA=true
AGENT_MAX_STEP=10       # Fewer steps for faster iteration
```

### Production Setup

```bash
# .env
API_PORT=8080
MAX_CONCURRENT_JOBS=1
MAX_SIMULATION_DAYS=30
AUTO_DOWNLOAD_PRICE_DATA=true
AGENT_MAX_STEP=30
```

### Multi-Model Competition

```json
// configs/default_config.json
{
  "models": [
    {
      "name": "GPT-4",
      "basemodel": "openai/gpt-4",
      "signature": "gpt-4",
      "enabled": true
    },
    {
      "name": "Claude 3.7",
      "basemodel": "anthropic/claude-3.7-sonnet",
      "signature": "claude-3.7",
      "enabled": true,
      "openai_base_url": "https://api.anthropic.com/v1",
      "openai_api_key": "anthropic-key"
    },
    {
      "name": "GPT-3.5 Turbo",
      "basemodel": "openai/gpt-3.5-turbo",
      "signature": "gpt-3.5-turbo",
      "enabled": false  // Not run by default
    }
  ]
}
```

---

## Environment Variable Priority

When the same configuration exists in multiple places:

1. **API request parameters** (highest priority)
2. **Model-specific config** (`openai_base_url`, `openai_api_key` in model config)
3. **Environment variables** (`.env` file)
4. **Default values** (lowest priority)

Example:

```json
// If the model config has:
{
  "openai_api_key": "model-specific-key"
}

// This overrides OPENAI_API_KEY from .env
```
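The four-level lookup can be expressed as a single helper. This is illustrative only: the names `request`, `model_cfg`, and `DEFAULTS` are assumptions for the sketch, not AI-Trader internals; it simply formalizes the precedence order listed above.

```python
import os

# Fallbacks at the bottom of the precedence chain (assumed values)
DEFAULTS = {"openai_api_key": None, "openai_base_url": "https://api.openai.com/v1"}
ENV_NAMES = {"openai_api_key": "OPENAI_API_KEY", "openai_base_url": "OPENAI_API_BASE"}

def resolve(setting, request, model_cfg):
    """Resolve a setting using the documented precedence:
    API request > model config > environment variable > default."""
    for source in (request, model_cfg):
        if source.get(setting) is not None:
            return source[setting]
    env_value = os.environ.get(ENV_NAMES[setting])
    if env_value:
        return env_value
    return DEFAULTS[setting]
```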
---

## Validation

After configuration changes:

```bash
# Restart the service
docker-compose down
docker-compose up -d

# Verify health
curl http://localhost:8080/health

# Check logs for errors
docker logs ai-trader | grep -i error
```

docs/user-guide/integration-examples.md (new file, 197 lines)
@@ -0,0 +1,197 @@
# Integration Examples

Examples for integrating AI-Trader with external systems.

---

## Python

See the complete Python client in [API_REFERENCE.md](../../API_REFERENCE.md#client-libraries).

### Async Client

```python
import aiohttp
import asyncio

class AsyncAITraderClient:
    def __init__(self, base_url="http://localhost:8080"):
        self.base_url = base_url

    async def trigger_simulation(self, start_date, end_date=None, models=None):
        payload = {"start_date": start_date}
        if end_date:
            payload["end_date"] = end_date
        if models:
            payload["models"] = models

        async with aiohttp.ClientSession() as session:
            async with session.post(
                f"{self.base_url}/simulate/trigger",
                json=payload
            ) as response:
                response.raise_for_status()
                return await response.json()

    async def wait_for_completion(self, job_id, poll_interval=10):
        async with aiohttp.ClientSession() as session:
            while True:
                async with session.get(
                    f"{self.base_url}/simulate/status/{job_id}"
                ) as response:
                    status = await response.json()

                if status["status"] in ["completed", "partial", "failed"]:
                    return status

                await asyncio.sleep(poll_interval)

# Usage
async def main():
    client = AsyncAITraderClient()
    job = await client.trigger_simulation("2025-01-16", models=["gpt-4"])
    result = await client.wait_for_completion(job["job_id"])
    print(f"Simulation completed: {result['status']}")

asyncio.run(main())
```

---

## TypeScript/JavaScript

See the complete TypeScript client in [API_REFERENCE.md](../../API_REFERENCE.md#client-libraries).

---

## Bash/Shell Scripts

### Daily Automation

```bash
#!/bin/bash
# daily_simulation.sh

API_URL="http://localhost:8080"
DATE=$(date -d "yesterday" +%Y-%m-%d)

echo "Triggering simulation for $DATE"

# Trigger
RESPONSE=$(curl -s -X POST "$API_URL/simulate/trigger" \
  -H "Content-Type: application/json" \
  -d "{\"start_date\": \"$DATE\", \"models\": [\"gpt-4\"]}")

JOB_ID=$(echo "$RESPONSE" | jq -r '.job_id')
echo "Job ID: $JOB_ID"

# Poll
while true; do
  STATUS=$(curl -s "$API_URL/simulate/status/$JOB_ID" | jq -r '.status')
  echo "Status: $STATUS"

  if [[ "$STATUS" == "completed" ]] || [[ "$STATUS" == "partial" ]] || [[ "$STATUS" == "failed" ]]; then
    break
  fi

  sleep 30
done

# Get results
curl -s "$API_URL/results?job_id=$JOB_ID" | jq '.' > "results_$DATE.json"
echo "Results saved to results_$DATE.json"
```

Add to crontab:

```bash
0 6 * * * /path/to/daily_simulation.sh >> /var/log/ai-trader.log 2>&1
```

---

## Apache Airflow

```python
from airflow import DAG
from airflow.operators.python import PythonOperator
from datetime import datetime, timedelta
import requests
import time

def trigger_simulation(**context):
    # Note: Jinja templates such as "{{ ds }}" are not rendered inside a
    # python_callable; read the execution date from the task context instead.
    response = requests.post(
        "http://ai-trader:8080/simulate/trigger",
        json={"start_date": context["ds"], "models": ["gpt-4"]}
    )
    response.raise_for_status()
    return response.json()["job_id"]

def wait_for_completion(**context):
    job_id = context["task_instance"].xcom_pull(task_ids="trigger")

    while True:
        response = requests.get(f"http://ai-trader:8080/simulate/status/{job_id}")
        status = response.json()

        if status["status"] in ["completed", "partial", "failed"]:
            return status

        time.sleep(30)

def fetch_results(**context):
    job_id = context["task_instance"].xcom_pull(task_ids="trigger")
    response = requests.get(f"http://ai-trader:8080/results?job_id={job_id}")
    return response.json()

default_args = {
    "owner": "airflow",
    "depends_on_past": False,
    "start_date": datetime(2025, 1, 1),
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}

dag = DAG(
    "ai_trader_simulation",
    default_args=default_args,
    schedule_interval="0 6 * * *",  # Daily at 6 AM
    catchup=False
)

trigger_task = PythonOperator(
    task_id="trigger",
    python_callable=trigger_simulation,
    dag=dag
)

wait_task = PythonOperator(
    task_id="wait",
    python_callable=wait_for_completion,
    dag=dag
)

fetch_task = PythonOperator(
    task_id="fetch_results",
    python_callable=fetch_results,
    dag=dag
)

trigger_task >> wait_task >> fetch_task
```

---

## Generic Workflow Automation

Any HTTP-capable automation service can integrate with AI-Trader:

1. **Trigger:** POST to `/simulate/trigger`
2. **Poll:** GET `/simulate/status/{job_id}` every 10-30 seconds
3. **Retrieve:** GET `/results?job_id={job_id}` when complete
4. **Store:** Save results to your database/warehouse

**Key considerations:**

- Handle 400 errors (concurrent jobs) gracefully
- Implement exponential backoff for retries
- Monitor the health endpoint before triggering
- Store the `job_id` for tracking and debugging
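The trigger/poll/retrieve loop can be sketched with nothing but the Python standard library. This is an illustrative minimal client under the endpoints documented above; error handling and backoff are intentionally simplified:

```python
import json
import time
import urllib.request

BASE_URL = "http://localhost:8080"
TERMINAL_STATUSES = {"completed", "partial", "failed"}

def _request(method, path, payload=None):
    """Send a JSON request to the AI-Trader API and decode the response."""
    data = json.dumps(payload).encode() if payload is not None else None
    req = urllib.request.Request(
        BASE_URL + path, data=data, method=method,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def is_terminal(status):
    """A job is done once it reaches a terminal status."""
    return status in TERMINAL_STATUSES

def run_simulation(start_date, models=None, poll_interval=30):
    """Trigger a job, poll until it finishes, then return its results."""
    payload = {"start_date": start_date}
    if models:
        payload["models"] = models
    job_id = _request("POST", "/simulate/trigger", payload)["job_id"]

    while not is_terminal(_request("GET", f"/simulate/status/{job_id}")["status"]):
        time.sleep(poll_interval)

    return _request("GET", f"/results?job_id={job_id}")
```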
docs/user-guide/troubleshooting.md (new file, 488 lines)
@@ -0,0 +1,488 @@
# Troubleshooting Guide

Common issues and solutions for AI-Trader.

---

## Container Issues

### Container Won't Start

**Symptoms:**
- `docker ps` shows no ai-trader container
- Container exits immediately after starting

**Debug:**
```bash
# Check logs
docker logs ai-trader

# Check if the container exists (stopped)
docker ps -a | grep ai-trader
```

**Common Causes & Solutions:**

**1. Missing API Keys**
```bash
# Verify .env file
cat .env | grep -E "OPENAI_API_KEY|ALPHAADVANTAGE_API_KEY|JINA_API_KEY"

# Should show all three keys with values
```

**Solution:** Add the missing keys to `.env`

**2. Port Already in Use**
```bash
# Check what's using port 8080
sudo lsof -i :8080              # Linux/Mac
netstat -ano | findstr :8080    # Windows
```

**Solution:** Change the port in `.env`:
```bash
echo "API_PORT=8889" >> .env
docker-compose down
docker-compose up -d
```

**3. Volume Permission Issues**
```bash
# Fix permissions
chmod -R 755 data logs configs
```

---

### Health Check Fails

**Symptoms:**
- `curl http://localhost:8080/health` returns an error or an HTML page
- Container running but API not responding

**Debug:**
```bash
# Check if the API process is running
docker exec ai-trader ps aux | grep uvicorn

# Test internal health (always port 8080 inside the container)
docker exec ai-trader curl http://localhost:8080/health

# Check the configured port
grep API_PORT .env
```

**Solutions:**

**If you get an HTML 404 page:**
Another service is using your configured port.

```bash
# Find the conflicting service
sudo lsof -i :8080

# Change the AI-Trader port
echo "API_PORT=8889" >> .env
docker-compose down
docker-compose up -d

# Now use the new port
curl http://localhost:8889/health
```

**If the MCP services didn't start:**
```bash
# Check MCP processes
docker exec ai-trader ps aux | grep python

# Should see 4 MCP services on ports 8000-8003
```

**If database issues:**
```bash
# Check the database file
docker exec ai-trader ls -l /app/data/jobs.db

# If missing, restart to recreate it
docker-compose restart
```

---

## Simulation Issues

### Job Stays in "Pending" Status

**Symptoms:**
- Job triggered but never progresses to "running"
- Status remains "pending" indefinitely

**Debug:**
```bash
# Check worker logs
docker logs ai-trader | grep -i "worker\|simulation"

# Check the database
docker exec ai-trader sqlite3 /app/data/jobs.db "SELECT * FROM job_details;"

# Check MCP service accessibility
docker exec ai-trader curl http://localhost:8000/health
```

**Solutions:**

```bash
# Restart the container (jobs resume automatically)
docker-compose restart

# Check a specific job's status with details
curl http://localhost:8080/simulate/status/$JOB_ID | jq '.details'
```

---

### Job Takes Too Long / Timeouts

**Symptoms:**
- Jobs taking longer than expected
- Test scripts timing out

**Expected Execution Times:**
- Single model-day: 2-5 minutes (with cached price data)
- First run with data download: 10-15 minutes
- 2-date, 2-model job: 10-20 minutes

**Solutions:**

**Increase the poll timeout in monitoring:**
```bash
# Instead of a fixed number of polls, loop until a terminal status
while true; do
  STATUS=$(curl -s http://localhost:8080/simulate/status/$JOB_ID | jq -r '.status')
  echo "$(date): Status = $STATUS"

  if [[ "$STATUS" == "completed" ]] || [[ "$STATUS" == "partial" ]] || [[ "$STATUS" == "failed" ]]; then
    break
  fi

  sleep 30
done
```

**Check if the agent is stuck:**
```bash
# View real-time logs
docker logs -f ai-trader

# Look for repeated errors or infinite loops
```

---

### "No trading dates with complete price data"

**Error Message:**
```
No trading dates with complete price data in range 2025-01-16 to 2025-01-17.
All symbols must have data for a date to be tradeable.
```

**Cause:** Missing price data for the requested dates.

**Solutions:**

**Option 1: Try Recent Dates**

Use more recent dates where data is more likely available:
```bash
curl -X POST http://localhost:8080/simulate/trigger \
  -H "Content-Type: application/json" \
  -d '{"start_date": "2024-12-15", "models": ["gpt-4"]}'
```

**Option 2: Manually Download Data**

```bash
docker exec -it ai-trader bash
cd data
python get_daily_price.py   # Downloads latest data
python merge_jsonl.py       # Merges into database
exit

# Retry the simulation
```

**Option 3: Check the Auto-Download Setting**

```bash
# Ensure auto-download is enabled
grep AUTO_DOWNLOAD_PRICE_DATA .env

# Should be: AUTO_DOWNLOAD_PRICE_DATA=true
```

---

### Rate Limit Errors

**Symptoms:**
- Logs show "rate limit" messages
- Partial data downloaded

**Cause:** Alpha Vantage API rate limits (5 req/min free tier, 75 req/min premium)

**Solutions:**

**For the free tier:**
- Simulations automatically continue with available data
- The next simulation resumes downloads
- Consider upgrading to a premium API key

**Workaround:**
```bash
# Pre-download data in batches
docker exec -it ai-trader bash
cd data

# Download in stages (wait 1 min between runs)
python get_daily_price.py
sleep 60
python get_daily_price.py
sleep 60
python get_daily_price.py

python merge_jsonl.py
exit
```

---
## API Issues

### 400 Bad Request: Another Job Running

**Error:**
```json
{
  "detail": "Another simulation job is already running or pending. Please wait for it to complete."
}
```

**Cause:** AI-Trader allows only 1 concurrent job by default.

**Solutions:**

**Check current jobs:**
```bash
# Verify the API is up
curl http://localhost:8080/health

# Query recent jobs (requires checking the database)
docker exec ai-trader sqlite3 /app/data/jobs.db \
  "SELECT job_id, status FROM jobs ORDER BY created_at DESC LIMIT 5;"
```

**Wait for completion:**
```bash
# Get the blocking job's status
curl http://localhost:8080/simulate/status/{job_id}
```

**Force-stop a stuck job (last resort):**
```bash
# Update the job status in the database
docker exec ai-trader sqlite3 /app/data/jobs.db \
  "UPDATE jobs SET status='failed' WHERE status IN ('pending', 'running');"

# Restart the service
docker-compose restart
```

---

### Invalid Date Format Errors

**Error:**
```json
{
  "detail": "Invalid date format: 2025-1-16. Expected YYYY-MM-DD"
}
```

**Solution:** Use zero-padded dates:

```bash
# Wrong
{"start_date": "2025-1-16"}

# Correct
{"start_date": "2025-01-16"}
```

---

### Date Range Too Large

**Error:**
```json
{
  "detail": "Date range too large: 45 days. Maximum allowed: 30 days"
}
```

**Solution:** Split into smaller batches:

```bash
# Instead of 2025-01-01 to 2025-02-15 (45 days), run two jobs:

# Job 1: Jan 1-30
curl -X POST http://localhost:8080/simulate/trigger \
  -H "Content-Type: application/json" \
  -d '{"start_date": "2025-01-01", "end_date": "2025-01-30"}'

# Job 2: Jan 31 - Feb 15
curl -X POST http://localhost:8080/simulate/trigger \
  -H "Content-Type: application/json" \
  -d '{"start_date": "2025-01-31", "end_date": "2025-02-15"}'
```

---

## Data Issues

### Database Corruption

**Symptoms:**
- "database disk image is malformed"
- Unexpected SQL errors

**Solutions:**

**Backup and rebuild:**
```bash
# Stop the service
docker-compose down

# Back up the current database
cp data/jobs.db data/jobs.db.backup

# Try recovery
docker run --rm -v $(pwd)/data:/data alpine sqlite3 /data/jobs.db "PRAGMA integrity_check;"

# If corrupted, delete and restart (loses job history)
rm data/jobs.db
docker-compose up -d
```

---

### Missing Price Data Files

**Symptoms:**
- Errors about a missing `merged.jsonl`
- Price query failures

**Solution:**

```bash
# Re-download price data
docker exec -it ai-trader bash
cd data
python get_daily_price.py
python merge_jsonl.py
ls -lh merged.jsonl   # Should exist
exit
```

---

## Performance Issues

### Slow Simulation Execution

**Typical speeds:**
- Single model-day: 2-5 minutes
- With cold start (first time): +3-5 minutes

**Causes & Solutions:**

**1. AI model API is slow**
- Check the AI provider's status page
- Try a different model
- Increase the timeout in config

**2. Network latency**
- Check the internet connection
- The Jina Search API might be slow

**3. MCP services overloaded**
```bash
# Check CPU usage
docker stats ai-trader
```

---

### High Memory Usage

**Normal:** 500MB - 1GB during simulation

**If higher:**
```bash
# Check memory
docker stats ai-trader

# Restart if needed
docker-compose restart
```

---

## Diagnostic Commands

```bash
# Container status
docker ps | grep ai-trader

# Real-time logs
docker logs -f ai-trader

# Check errors only
docker logs ai-trader 2>&1 | grep -i error

# Container resource usage
docker stats ai-trader

# Access the container shell
docker exec -it ai-trader bash

# Database inspection
docker exec -it ai-trader sqlite3 /app/data/jobs.db
sqlite> SELECT * FROM jobs ORDER BY created_at DESC LIMIT 5;
sqlite> SELECT status, COUNT(*) FROM jobs GROUP BY status;
sqlite> .quit

# Check file permissions
docker exec ai-trader ls -la /app/data

# Test API connectivity
curl -v http://localhost:8080/health

# View all environment variables
docker exec ai-trader env | sort
```

---

## Getting More Help

If your issue isn't covered here:

1. **Check logs** for specific error messages
2. **Review** [API_REFERENCE.md](../../API_REFERENCE.md) for correct usage
3. **Search** [GitHub Issues](https://github.com/Xe138/AI-Trader/issues)
4. **Open a new issue** with:
   - Error messages from logs
   - Steps to reproduce
   - Environment details (OS, Docker version)
   - Relevant config files (redact API keys)

docs/user-guide/using-the-api.md (new file, 182 lines)
@@ -0,0 +1,182 @@
# Using the API

Common workflows and best practices for the AI-Trader API.

---

## Basic Workflow

### 1. Trigger Simulation

```bash
curl -X POST http://localhost:8080/simulate/trigger \
  -H "Content-Type: application/json" \
  -d '{
    "start_date": "2025-01-16",
    "end_date": "2025-01-17",
    "models": ["gpt-4"]
  }'
```

Save the `job_id` from the response.

### 2. Poll for Completion

```bash
JOB_ID="your-job-id-here"

while true; do
  STATUS=$(curl -s http://localhost:8080/simulate/status/$JOB_ID | jq -r '.status')
  echo "Status: $STATUS"

  if [[ "$STATUS" == "completed" ]] || [[ "$STATUS" == "partial" ]] || [[ "$STATUS" == "failed" ]]; then
    break
  fi

  sleep 10
done
```

### 3. Retrieve Results

```bash
curl "http://localhost:8080/results?job_id=$JOB_ID" | jq '.'
```

---

## Common Patterns

### Single-Day Simulation

Omit `end_date` to simulate just one day:

```bash
curl -X POST http://localhost:8080/simulate/trigger \
  -H "Content-Type: application/json" \
  -d '{"start_date": "2025-01-16", "models": ["gpt-4"]}'
```

### All Enabled Models

Omit `models` to run all enabled models from the config:

```bash
curl -X POST http://localhost:8080/simulate/trigger \
  -H "Content-Type: application/json" \
  -d '{"start_date": "2025-01-16", "end_date": "2025-01-20"}'
```

### Filter Results

```bash
# By date
curl "http://localhost:8080/results?date=2025-01-16"

# By model
curl "http://localhost:8080/results?model=gpt-4"

# Combined
curl "http://localhost:8080/results?job_id=$JOB_ID&date=2025-01-16&model=gpt-4"
```

---

## Best Practices

### 1. Check Health Before Triggering

```bash
curl http://localhost:8080/health

# Only proceed if status is "healthy"
```

### 2. Use Exponential Backoff for Retries

```python
import time
import requests

def trigger_with_retry(max_retries=3):
    for attempt in range(max_retries):
        try:
            response = requests.post(
                "http://localhost:8080/simulate/trigger",
                json={"start_date": "2025-01-16"}
            )
            response.raise_for_status()
            return response.json()
        except requests.HTTPError as e:
            if e.response.status_code == 400:
                # Don't retry on validation errors
                raise
            wait = 2 ** attempt  # 1s, 2s, 4s
            time.sleep(wait)

    raise Exception("Max retries exceeded")
```

### 3. Handle Concurrent Job Conflicts

```python
import requests

response = requests.post(
    "http://localhost:8080/simulate/trigger",
    json={"start_date": "2025-01-16"}
)

if response.status_code == 400 and "already running" in response.json()["detail"]:
    print("Another job is running. Waiting...")
    # Wait and retry, or query the existing job's status
```

### 4. Monitor Progress with Details

```python
import requests

def get_detailed_progress(job_id):
    response = requests.get(f"http://localhost:8080/simulate/status/{job_id}")
    status = response.json()

    print(f"Overall: {status['status']}")
    print(f"Progress: {status['progress']['completed']}/{status['progress']['total_model_days']}")

    # Show per-model-day status
    for detail in status['details']:
        print(f"  {detail['trading_date']} {detail['model_signature']}: {detail['status']}")
```

---

## Error Handling

### Validation Errors (400)

```python
import requests

try:
    response = requests.post(
        "http://localhost:8080/simulate/trigger",
        json={"start_date": "2025-1-16"}  # Wrong format
    )
    response.raise_for_status()
except requests.HTTPError as e:
    if e.response.status_code == 400:
        print(f"Validation error: {e.response.json()['detail']}")
        # Fix the input and retry
```

### Service Unavailable (503)

```python
import requests

try:
    response = requests.post(
        "http://localhost:8080/simulate/trigger",
        json={"start_date": "2025-01-16"}
    )
    response.raise_for_status()
except requests.HTTPError as e:
    if e.response.status_code == 503:
        print("Service unavailable (likely the price data download failed)")
        # Retry later or check ALPHAADVANTAGE_API_KEY
```

---

See [API_REFERENCE.md](../../API_REFERENCE.md) for complete endpoint documentation.