feat: add mlx-proxy server and split n8n workflow LLM/embedding URLs
Adds an mlx-vlm based, ollama-compatible proxy server (port 11435). Injects a callLLM wrapper into the 6 n8n GEN nodes (health check + ollama fallback). Embeddings/reranker are split off to ollama (LOCAL_EMBED_URL).
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@@ -11,6 +11,7 @@ SERVICES=(
   "com.syn-chat-bot.mail-bridge"
   "com.syn-chat-bot.inbox-processor"
   "com.syn-chat-bot.news-digest"
+  "com.mlx-proxy"
 )
 
 PLIST_DIR="$HOME/Library/LaunchAgents"
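The callLLM wrapper described above routes generation calls to mlx-proxy when it is healthy and falls back to ollama otherwise. A minimal shell sketch of that routing logic follows; the URLs, the `/api/tags` health probe, and the `pick_llm_url` helper name are assumptions based on the commit message (mlx-proxy on port 11435, ollama on its default port 11434), not code from this commit.

```shell
#!/bin/sh
# Hypothetical sketch of the health-check + fallback routing performed
# by the callLLM wrapper. Ports and endpoint are assumed, not confirmed.
MLX_PROXY_URL="http://127.0.0.1:11435"
OLLAMA_URL="http://127.0.0.1:11434"

pick_llm_url() {
  # Probe the mlx-proxy with a short timeout; if it does not answer,
  # fall back to the local ollama instance.
  if curl -sf --max-time 2 "$MLX_PROXY_URL/api/tags" >/dev/null 2>&1; then
    echo "$MLX_PROXY_URL"
  else
    echo "$OLLAMA_URL"
  fi
}

LLM_URL="$(pick_llm_url)"
echo "routing LLM calls to: $LLM_URL"
```

The embedding/reranker path bypasses this check entirely and always targets LOCAL_EMBED_URL, so a proxy outage only affects generation, not retrieval.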