hyungi_document_server/app/workers/digest_worker.py
Hyungi Ahn 75a1919342 feat(digest): Phase 4 Global News Digest (cluster-level batch summarization)
Generates a daily batch at 04:00 KST that groups 7-day rolling-window news at two levels, country × topic.
The search pipeline is not used: documents → clustering → cluster-level LLM summarization → digest.

Key decisions:
- adaptive threshold (0.75/0.78/0.80) + EMA centroid (α=0.7) + time-decay (λ=ln(2)/3)
- min_articles=3, max_topics=10 per country, top-5 MMR diversity, ai_summary[:300] truncation
- cluster-level LLM only; no-drop fallback (topic_label="주요 뉴스 묶음" ("major news bundle") + top member's ai_summary[:200])
- importance_score normalized to 0–1 per country, raw_weight_sum preserved separately, max(score, 0.01) floor
- per-call timeout 25 s + pipeline hard cap 600 s
- idempotent DELETE+INSERT (UNIQUE digest_date); calls AIClient._call_chat directly (no changes to client.py)
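The EMA centroid update and time-decay weighting above can be sketched as follows. This is an illustrative reconstruction from the constants in the commit message only: the function names are hypothetical, and whether α=0.7 weights the old centroid or the new embedding is an assumption (here it weights the existing centroid). The 0.75/0.78/0.80 threshold tiers are not sketched because their mapping to cluster state is not stated.

```python
import math
from datetime import datetime, timedelta

# Constants taken from the commit message; everything else is illustrative.
EMA_ALPHA = 0.7                  # assumed: weight kept on the existing centroid
DECAY_LAMBDA = math.log(2) / 3   # λ = ln(2)/3, i.e. a 3-day half-life

def ema_update(centroid, embedding, alpha=EMA_ALPHA):
    """Blend a new article embedding into the cluster centroid (EMA)."""
    return [alpha * c + (1 - alpha) * e for c, e in zip(centroid, embedding)]

def time_decay_weight(published_at, now, lam=DECAY_LAMBDA):
    """Exponential time decay: an article loses half its weight every 3 days."""
    age_days = (now - published_at).total_seconds() / 86400
    return math.exp(-lam * age_days)

now = datetime(2026, 4, 9)
w_fresh = time_decay_weight(now, now)                    # → 1.0
w_3d = time_decay_weight(now - timedelta(days=3), now)   # → ≈ 0.5 (half-life)
```

Under this reading, a 6-day-old article (the edge of the rolling window) still contributes about a quarter of a fresh article's weight, which matches the idea of deprioritizing rather than dropping older news.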

New:
- migrations/101_global_digests.sql (normalized two-table schema)
- app/models/digest.py (GlobalDigest + DigestTopic ORM)
- app/services/digest/{loader,clustering,selection,summarizer,pipeline}.py
- app/workers/digest_worker.py (PIPELINE_HARD_CAP + CLI entry point)
- app/api/digest.py (/latest, ?date|country, /regenerate, inline Pydantic)
- app/prompts/digest_topic.txt (JSON-only + strict prohibition block)

main.py: 4 lines total (2 imports + 1 scheduler add_job + 1 include_router).
plan: ~/.claude/plans/quiet-herding-tome.md
2026-04-09 07:45:11 +09:00
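The four main.py lines described above might look like the fragment below. This is a hedged sketch, not the repository's actual diff: the `scheduler` and `app` objects are assumed to already exist in main.py, and the import aliases are hypothetical.

```python
# Hypothetical reconstruction of the 4-line main.py change.
from workers.digest_worker import run as run_digest    # import 1 (alias assumed)
from api.digest import router as digest_router         # import 2 (alias assumed)

# Daily 04:00 KST cron job on the pre-existing APScheduler instance.
scheduler.add_job(run_digest, "cron", hour=4, minute=0, timezone="Asia/Seoul")
app.include_router(digest_router)                      # mounts /latest, /regenerate, ...
```

Passing `"cron"` with keyword fields is standard APScheduler usage; an explicit `CronTrigger` would work equally well.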

"""Phase 4: Global Digest 워커.
7일 뉴스를 country × topic 으로 묶어 cluster-level LLM 요약을 생성하고
global_digests / digest_topics 테이블에 저장한다.
- APScheduler cron (매일 04:00 KST) + 수동 호출 공용 진입점
- PIPELINE_HARD_CAP = 600초 hard cap 으로 cron stuck 절대 방지
- 단독 실행: `python -m workers.digest_worker`
"""
import asyncio
from core.utils import setup_logger
from services.digest.pipeline import run_digest_pipeline
logger = setup_logger("digest_worker")
PIPELINE_HARD_CAP = 600 # 10분 hard cap
async def run() -> None:
"""APScheduler + 수동 호출 공용 진입점.
pipeline 자체는 timeout 으로 감싸지 않음 (per-call timeout 은 summarizer 가 처리).
여기서는 전체 hard cap 만 강제.
"""
try:
result = await asyncio.wait_for(
run_digest_pipeline(),
timeout=PIPELINE_HARD_CAP,
)
logger.info(f"[global_digest] 워커 완료: {result}")
except asyncio.TimeoutError:
logger.error(
f"[global_digest] HARD CAP {PIPELINE_HARD_CAP}s 초과 — 워커 강제 중단. "
f"기존 digest 는 commit 시점에만 갱신되므로 그대로 유지됨. "
f"다음 cron 실행에서 재시도."
)
except Exception as e:
logger.exception(f"[global_digest] 워커 실패: {e}")
if __name__ == "__main__":
asyncio.run(run())
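The "existing digest is only replaced at commit time" guarantee in the timeout handler rests on the DELETE+INSERT idempotency strategy named in the commit message. A minimal sketch of that pattern, using an in-memory SQLite table as a stand-in for the real global_digests schema (column names here are illustrative, not the migration's):

```python
import sqlite3

# Stand-in for global_digests; UNIQUE(digest_date) backs the idempotency.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE global_digests ("
    "  id INTEGER PRIMARY KEY,"
    "  digest_date TEXT NOT NULL UNIQUE,"
    "  payload TEXT NOT NULL)"
)

def save_digest(digest_date: str, payload: str) -> None:
    """Replace any existing digest for the date in a single transaction."""
    with conn:  # commits on success, rolls back on error: old row survives a failure
        conn.execute("DELETE FROM global_digests WHERE digest_date = ?", (digest_date,))
        conn.execute(
            "INSERT INTO global_digests (digest_date, payload) VALUES (?, ?)",
            (digest_date, payload),
        )

save_digest("2026-04-09", "v1")
save_digest("2026-04-09", "v2")  # rerun replaces, never duplicates
rows = conn.execute("SELECT payload FROM global_digests").fetchall()
```

Because both statements share one transaction, a crash or hard-cap abort mid-pipeline leaves the previous day's digest untouched, which is exactly the property the worker's TimeoutError message relies on.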