feat: mlx-proxy server + split n8n workflow LLM/embedding URLs

Add an ollama-compatible proxy server based on mlx-vlm (port 11435).
Inject a callLLM wrapper into the 6 n8n GEN nodes (health check + ollama fallback).
Embeddings/reranker are split out to ollama (LOCAL_EMBED_URL).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Author: Hyungi Ahn
Date: 2026-03-19 10:00:00 +09:00
parent a050f2e7d5
commit 1137754964
5 changed files with 162 additions and 14 deletions


@@ -11,6 +11,7 @@ SERVICES=(
"com.syn-chat-bot.mail-bridge"
"com.syn-chat-bot.inbox-processor"
"com.syn-chat-bot.news-digest"
"com.mlx-proxy"
)
PLIST_DIR="$HOME/Library/LaunchAgents"
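
The callLLM wrapper described in the commit message (health check against the mlx-proxy, fallback to ollama) could be sketched roughly as below. This is a hypothetical reconstruction, not the actual code injected into the n8n nodes: the URLs, the model name, and the `pickBackend` helper are all assumptions; only the proxy port (11435), the ollama fallback behavior, and the standard ollama endpoints (`/api/tags`, `/api/generate`) come from the commit or from ollama's documented API.

```javascript
// Hypothetical sketch of the callLLM wrapper for the n8n GEN nodes.
// MLX_PROXY_URL uses port 11435 from the commit; OLLAMA_URL uses
// ollama's default port 11434. The model name is a placeholder.
const MLX_PROXY_URL = 'http://127.0.0.1:11435';
const OLLAMA_URL = 'http://127.0.0.1:11434';

// Pure routing decision: prefer mlx-proxy when healthy, else fall back to ollama.
function pickBackend(proxyHealthy) {
  return proxyHealthy ? MLX_PROXY_URL : OLLAMA_URL;
}

async function callLLM(prompt, model = 'qwen2.5') {
  // Health check: a quick GET against the proxy's ollama-compatible
  // /api/tags endpoint, with a short timeout so a dead proxy doesn't stall the node.
  let proxyHealthy = false;
  try {
    const res = await fetch(`${MLX_PROXY_URL}/api/tags`, {
      signal: AbortSignal.timeout(1000),
    });
    proxyHealthy = res.ok;
  } catch (_) {
    // Proxy unreachable -> fall back to ollama.
  }

  const base = pickBackend(proxyHealthy);
  const resp = await fetch(`${base}/api/generate`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model, prompt, stream: false }),
  });
  const data = await resp.json();
  return data.response;
}
```

Keeping the backend choice in a small pure function like `pickBackend` makes the fallback logic testable without any running server; embedding/reranker calls would bypass this wrapper entirely and go straight to `LOCAL_EMBED_URL`.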