feat: mlx-proxy server + separate LLM/embedding URLs in n8n workflows

Add an ollama-compatible proxy server backed by mlx-vlm (port 11435).
Inject a callLLM wrapper into the 6 n8n GEN nodes (health check + ollama fallback).
Embeddings/reranker stay on ollama, split out via LOCAL_EMBED_URL.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Author: Hyungi Ahn
Date: 2026-03-19 10:00:00 +09:00
parent a050f2e7d5
commit 1137754964
5 changed files with 162 additions and 14 deletions
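
The callLLM wrapper mentioned in the message (health check + ollama fallback) might look roughly like the sketch below. The localhost URLs and the ollama-style endpoint paths (`/api/tags`, `/api/chat`) are assumptions for illustration; the actual wrapper injected into the GEN nodes is not shown in this diff.

```javascript
// Hypothetical sketch of the callLLM wrapper: try the mlx-proxy first,
// fall back to plain ollama when the proxy's health check fails.
const MLX_PROXY_URL = "http://127.0.0.1:11435"; // assumed proxy address
const OLLAMA_URL = "http://127.0.0.1:11434";    // assumed ollama address

// Health check: any 2xx response from the proxy counts as "up".
async function isHealthy(baseUrl, fetchFn = fetch) {
  try {
    const res = await fetchFn(`${baseUrl}/api/tags`, {
      signal: AbortSignal.timeout(2000), // don't hang the workflow
    });
    return res.ok;
  } catch {
    return false; // connection refused, timeout, etc.
  }
}

// Route the chat request to whichever backend is available.
async function callLLM(payload, fetchFn = fetch) {
  const base = (await isHealthy(MLX_PROXY_URL, fetchFn))
    ? MLX_PROXY_URL
    : OLLAMA_URL;
  const res = await fetchFn(`${base}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
  return res.json();
}
```

Injecting `fetchFn` keeps the routing logic testable without a live server; inside an n8n Code node the global `fetch` default would be used.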

com.mlx-proxy.plist (new file, 27 lines)

@@ -0,0 +1,27 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>Label</key>
<string>com.mlx-proxy</string>
<key>ProgramArguments</key>
<array>
<string>/Users/hyungi/mlx-env/bin/uvicorn</string>
<string>mlx_proxy:app</string>
<string>--host</string>
<string>0.0.0.0</string>
<string>--port</string>
<string>11435</string>
</array>
<key>WorkingDirectory</key>
<string>/Users/hyungi/Documents/code/syn-chat-bot</string>
<key>RunAtLoad</key>
<true/>
<key>KeepAlive</key>
<true/>
<key>StandardOutPath</key>
<string>/tmp/mlx-proxy.log</string>
<key>StandardErrorPath</key>
<string>/tmp/mlx-proxy.err</string>
</dict>
</plist>
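
Once copied into `~/Library/LaunchAgents`, this agent can be started with `launchctl` (macOS-only; a minimal sketch using the standard load/list subcommands, with the log paths from the plist above):

```shell
# Install and start the agent (launches uvicorn on port 11435 at login,
# and keeps it alive thanks to KeepAlive=true)
cp com.mlx-proxy.plist ~/Library/LaunchAgents/
launchctl load ~/Library/LaunchAgents/com.mlx-proxy.plist

# Verify it is running and watch the proxy's stdout log
launchctl list | grep com.mlx-proxy
tail -f /tmp/mlx-proxy.log
```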