Files
gpu-services/nanoclaude/config.py
Hyungi Ahn d946b769e5 feat: NanoClaude Phase 1 — implement async job-based AI Gateway core
POST /chat → job_id ACK, GET /chat/{job_id}/stream → SSE streaming,
EXAONE Ollama adapter, JobManager, StateStream, Worker structure

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 11:12:04 +09:00
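The commit message describes a two-step flow: POST /chat immediately acknowledges with a job_id, and the client then consumes results from GET /chat/{job_id}/stream via SSE. A minimal client sketch of that flow, assuming a JSON `{"prompt": ...}` request body and a JSON `{"job_id": ...}` ACK (the payload shapes are assumptions; only the endpoint paths come from the commit message):

```python
# Hypothetical client for the async job flow described in the commit.
# The endpoint paths (/chat, /chat/{job_id}/stream) are from the commit
# message; the request/response JSON shapes are illustrative assumptions.
import json
from urllib import request

BASE = "http://localhost:8100"  # matches the default host/port in config.py


def submit_chat(prompt: str) -> str:
    """POST /chat and return the acknowledged job_id (assumed JSON shape)."""
    body = json.dumps({"prompt": prompt}).encode()
    req = request.Request(
        f"{BASE}/chat",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["job_id"]


def stream_url(job_id: str) -> str:
    """Build the SSE endpoint URL for a given job."""
    return f"{BASE}/chat/{job_id}/stream"
```

A real client would open `stream_url(job_id)` with an SSE-capable reader and consume `data:` events until the stream closes.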


from pydantic_settings import BaseSettings


class Settings(BaseSettings):
    # EXAONE via Ollama
    exaone_base_url: str = "http://localhost:11434"
    exaone_model: str = "exaone:7.8b"
    exaone_temperature: float = 0.7
    exaone_timeout: float = 120.0

    # Server
    host: str = "0.0.0.0"
    port: int = 8100

    # Optional API key (empty = disabled)
    api_key: str = ""

    model_config = {"env_file": ".env", "extra": "ignore"}


settings = Settings()
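Because `model_config` points at `.env` and ignores extra keys, any field above can be overridden from environment variables or an `.env` file next to the service (pydantic-settings matches variable names to field names case-insensitively by default). A sample `.env` under those assumptions; the values shown are illustrative, not defaults from the repo:

```ini
# Illustrative .env for gpu-services/nanoclaude — override only what differs
# from the defaults in config.py. Unknown keys are ignored ("extra": "ignore").
EXAONE_BASE_URL=http://gpu-node:11434
EXAONE_TIMEOUT=300
PORT=8100
# Setting a non-empty value enables the optional API-key check.
API_KEY=example-key
```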