diff --git a/DOCKER-GUIDE.md b/DOCKER-GUIDE.md
new file mode 100644
index 0000000..5d04679
--- /dev/null
+++ b/DOCKER-GUIDE.md
@@ -0,0 +1,193 @@
+# TK-MP-Project Docker Guide
+
+## Quick Start
+
+### 1. Run the development environment
+```bash
+./docker-run.sh dev up
+```
+
+### 2. Run the production environment
+```bash
+./docker-run.sh prod up
+```
+
+### 3. Run the Synology NAS environment
+```bash
+./docker-run.sh synology up
+```
+
+## Available Commands
+
+| Command | Description |
+|--------|------|
+| `up` | Start containers (default) |
+| `down` | Stop containers |
+| `build` | Build images |
+| `rebuild` | Rebuild images (ignore cache) |
+| `logs` | Tail logs in real time |
+| `ps` or `status` | Check service status |
+| `restart` | Restart containers |
+
+## Per-Environment Settings
+
+### Development (dev)
+- **Ports**: all services exposed externally
+  - Frontend: http://localhost:13000
+  - Backend API: http://localhost:18000
+  - PostgreSQL: localhost:5432
+  - Redis: localhost:6379
+  - pgAdmin: http://localhost:5050
+- **Features**:
+  - Live code reload (Hot Reload)
+  - Debug mode enabled
+  - All log levels printed
+
+### Production (prod)
+- **Ports**: reverse proxy through Nginx
+  - Web: http://localhost (Nginx)
+  - HTTPS: https://localhost (SSL setup required)
+- **Features**:
+  - Internal service ports not exposed
+  - Optimized build
+  - Log level INFO
+  - pgAdmin disabled
+
+### Synology NAS (synology)
+- **Ports**: custom ports to avoid conflicts
+  - Frontend: http://localhost:10173
+  - Backend API: http://localhost:10080
+  - PostgreSQL: localhost:15432
+  - Redis: localhost:16379
+  - pgAdmin: http://localhost:15050
+- **Features**:
+  - Named volumes
+  - Compatible with Synology Container Manager
+
+## Environment Config Files
+
+Settings for each environment are managed in the following files:
+
+- `env.development` - development settings
+- `env.production` - production settings
+- `env.synology` - Synology settings
+
+### Key Environment Variables
+
+```bash
+# Deployment environment
+DEPLOY_ENV=development|production|synology
+
+# Ports
+FRONTEND_EXTERNAL_PORT=13000
+BACKEND_EXTERNAL_PORT=18000
+POSTGRES_EXTERNAL_PORT=5432
+
+# Database
+POSTGRES_DB=tk_mp_bom
+POSTGRES_USER=tkmp_user
+POSTGRES_PASSWORD=tkmp_password_2025
+
+# Debugging
+DEBUG=true|false
+LOG_LEVEL=DEBUG|INFO|WARNING|ERROR
+```
+
+## Usage Examples
+
+### Starting development
+```bash
+# Start the dev environment
+./docker-run.sh dev up
+
+# Tail the logs
+./docker-run.sh dev logs
+
+# Check status
+./docker-run.sh dev ps
+```
+
+### Production deployment
+```bash
+# Build the images
+./docker-run.sh prod build
+
+# Start production
+./docker-run.sh prod up
+
+# Check status
+./docker-run.sh prod ps
+```
+
+### Synology NAS deployment
+```bash
+# Start the Synology environment
+./docker-run.sh synology up
+
+# Tail the logs
+./docker-run.sh synology logs
+```
+
+### Container management
+```bash
+# Stop containers
+./docker-run.sh dev down
+
+# Restart containers
+./docker-run.sh dev restart
+
+# Rebuild images (ignore cache)
+./docker-run.sh dev rebuild
+```
+
+## Troubleshooting
+
+### Resolving port conflicts
+Edit the `*_EXTERNAL_PORT` variables in the environment config files.
+
+### Volume permission problems
+```bash
+# Remove the volumes and recreate them
+docker volume prune
+./docker-run.sh dev up
+```
+
+### Image build problems
+```bash
+# Rebuild without the cache
+./docker-run.sh dev rebuild
+```
+
+## File Structure
+
+```
+TK-MP-Project/
+├── docker-compose.yml      # Unified Docker Compose file
+├── docker-run.sh           # Run script
+├── env.development         # Development settings
+├── env.production          # Production settings
+├── env.synology            # Synology settings
+├── docker-backup/          # Backup of the previous files
+│   ├── docker-compose.yml
+│   ├── docker-compose.prod.yml
+│   ├── docker-compose.synology.yml
+│   └── docker-compose.override.yml
+└── DOCKER-GUIDE.md         # This guide
+```
+
+## Migration Guide
+
+If you were using the previous Docker Compose files:
+
+1. **Stop the existing containers**
+   ```bash
+   docker-compose down
+   ```
+
+2. **Start with the new workflow**
+   ```bash
+   ./docker-run.sh dev up
+   ```
+
+3. **The previous files are kept in the `docker-backup/` folder**
+
diff --git a/RULES.md b/RULES.md
index 6e55717..a59f838 100644
--- a/RULES.md
+++ b/RULES.md
@@ -1,4 +1,4 @@
-# TK-MP-Project: Unified Project Documentation
+# TK-MP-Project: Unified Project Documentation Hub
> **Last updated**: January 2025 (unified document created)
@@ -59,11 +59,15 @@ frontend/src/
├── api.js                       # API client (Axios config)
├── components/                  # Reusable components
│   ├── NavigationMenu.jsx       # Sidebar navigation (permission-based)
+│   ├── PersonalizedDashboard.jsx # Personalized dashboard (tailored per role)
+│   ├── ProjectSelector.jsx      # Project selection dropdown (with search)
│   ├── BOMFileUpload.jsx        # BOM file upload form
│   ├── BOMFileTable.jsx         # BOM file list table
│   └── RevisionUploadDialog.jsx # Revision upload dialog
└── pages/                       # Page components
-    └── DashboardPage.jsx        # Dashboard
+    ├── DashboardPage.jsx        # Dashboard (existing)
+    ├── ProjectWorkspacePage.jsx # Per-project workspace (new)
+    ├── BOMUploadPage.jsx        # BOM upload page (new)
    ├── ProjectsPage.jsx         # Project management
    ├── JobSelectionPage.jsx     # Project selection
    └── BOMStatusPage.jsx        # BOM management (main)
@@ -176,37 +180,401 @@ graph TD
| `RevisionPurchasePage.jsx` | 300+ lines | Split out the purchase logic | Low |
| `auth_service.py` | 300+ lines | Split services by feature | High |
-### API Endpoint Map
+### **Full API Endpoint Map** (current as of 2025.01)
-#### Authentication API (`/auth/`)
+> **Important**: whenever you add a new API, update this section and consider versioning and backward compatibility.
+
+#### **API Documentation Rules**
+1. **New API added**: reflect it in this document immediately
+2. **API changed**: include the change history and a migration guide
+3. **Permissions**: state the required permission for every endpoint
+4. **Response format**: follow the standard response structure
+
+#### **API Usage Guidelines (to prevent confusion)** ⭐ Important
+
+##### **1. Unified material API usage**
+```javascript
+// ✅ Correct - use the unified API wrapper
+import { fetchMaterials } from '../api';
+
+// Materials by file
+const byFile = await fetchMaterials({ file_id: 123, limit: 1000 });
+
+// Materials by project
+const byJob = await fetchMaterials({ job_no: 'J24-001', limit: 1000 });
+
+// Materials by revision
+const byRevision = await fetchMaterials({
+  job_no: 'J24-001',
+  revision: 'Rev.1',
+  limit: 1000
+});
+
+// ❌ Wrong - direct API calls are forbidden
+const response = await api.get('/files/materials-v2', { params }); // forbidden
+const response2 = await api.get('/files/materials', { params });   // does not exist
```
-POST /auth/login      # Login
-POST /auth/register   # Register user
-POST /auth/refresh    # Refresh token
-POST /auth/logout     # Logout
-GET  /auth/me         # Current user info
-GET  /auth/verify     # Verify token
+
+##### **2. Wrapper function vs direct call rules**
+```javascript
+// ✅ Recommended: use the wrapper functions in api.js
+import { fetchMaterials, fetchFiles, fetchJobs } from '../api';
+
+// ❌ Discouraged: direct API calls (special cases only)
+const response = await api.get('/files/materials-v2');
+```
+
+##### **3. Backend endpoint naming rules**
+- **Basic form**: `/{module}/{resource}`
+- **Versioned**: `/{module}/{resource}-v2` (keeps backward compatibility)
+- **Action-based**: `/{module}/{resource}/{action}`
+
+**Examples:**
+```
+/files/materials-v2                # Material list (latest version)
+/files/materials/summary           # Material summary stats
+/files/materials/compare-revisions # Revision comparison
+/purchase/items/calculate          # Purchase quantity calculation
+/materials/compare-revisions       # Material comparison (separate module)
+```
+
+##### **4. Standardized frontend API calls**
+```javascript
+// api.js - every API function is defined here
+export function fetchMaterials(params) {
+  return api.get('/files/materials-v2', { params });
+}
+
+export function fetchFiles(params) {
+  return api.get('/files', { params });
+}
+
+export function fetchJobs(params) {
+  return api.get('/jobs/', { params });
+}
+
+// Usage in a component
+import { fetchMaterials } from '../api';
+const response = await fetchMaterials({ job_no: 'J24-001' });
+```
+
+---
+
+#### **Authentication API (`/auth/`)**
+```http
+POST /auth/login      # Login (public)
+POST /auth/register   # Register user (admin)
+POST /auth/refresh    # Refresh token (auth required)
+POST /auth/logout     # Logout (auth required)
+GET  /auth/me         # Current user info (auth required)
+GET  /auth/verify     # Verify token (auth required)
GET  /auth/users      # User list (admin)
PUT  /auth/users/{id} # Update user (admin)
+DELETE /auth/users/{id} # Delete user (admin)
```
-#### Project API (`/jobs/`)
-```
-GET    /jobs/       # Project list
-POST   /jobs/       # Create project
-PUT    /jobs/{id}   # Update project
-DELETE /jobs/{id}   # Delete project
+#### **Project Management API (`/jobs/`)**
+```http
+GET    /jobs/             # Project list (user)
+POST   /jobs/             # Create project (manager+)
+GET    /jobs/{id}         # Project detail (user)
+PUT    /jobs/{id}         # Update project (manager+)
+DELETE /jobs/{id}         # Delete project (admin)
+GET    /jobs/stats        # Project statistics (manager+)
+POST   /jobs/{id}/assign  # Assign an owner (manager+)
 ```
-#### File/Material API (`/files/`)
+**Actual response structure:**
+```json
+// GET /jobs/ - project list
+{
+  "success": true,
+  "total_count": 2,
+  "jobs": [
+    {
+      "job_no": "J24-001",
+      "job_name": "Ulsan SK Energy refinery expansion piping work",
+      "project_name": "Ulsan SK Energy refinery expansion piping work",
+      "client_name": "Samsung Engineering",
+      "end_user": "SK Energy",
+      "epc_company": "Samsung Engineering",
+      "project_site": "Onsan Industrial Complex, Ulsan",
+      "contract_date": "2024-03-15",
+      "delivery_date": "2024-08-30",
+      "delivery_terms": "FOB Ulsan Port",
+      "project_type": "Piping materials",
+      "status": "In progress",
+      "description": "Supply of piping materials for a refinery expansion",
+      "created_at": "2025-07-15T03:44:46.035325"
+    }
+  ]
+}
+```
-GET  /files           # File list (job_no filter)
-POST /files/upload    # Upload file
-DELETE /files/{id}    # Delete file
-GET  /files/stats     # File/material stats
-GET  /files/materials # Material list (file_id filter)
+
+#### **File/Material Management API (`/files/`)**
+```http
+GET    /files                                      # File list (user)
+POST   /files/upload                               # Upload file (designer+) ⭐ user tracked
+DELETE /files/delete/{file_id}                     # Delete file (designer+)
+GET    /files/stats                                # File/material stats (user)
+GET    /files/materials-v2                         # Material list (user) ⭐ latest version
+GET    /files/materials/summary                    # Material summary stats (user)
+GET    /files/materials/compare-revisions          # Revision comparison (user)
+GET    /files/pipe-details                         # Pipe details (user)
+GET    /files/fitting-details                      # Fitting details (user)
+GET    /files/valve-details                        # Valve details (user)
+POST   /files/user-requirements                    # Create user requirement (user)
+GET    /files/user-requirements                    # List user requirements (user)
+POST   /files/materials/{id}/verify                # Verify material classification (designer+)
+PUT    /files/materials/{id}/update-classification # Update material classification (designer+)
+POST   /files/materials/confirm-purchase           # Confirm material purchase (purchaser+)
```
+**⚠️ Important: notes on using the material APIs**
+- ✅ **Use**: `/files/materials-v2` (latest version, supports every feature)
+- ❌ **Do not use**: `/files/materials` (does not exist; returns 404)
+- **Migration**: every component should use the `fetchMaterials()` wrapper
+
+#### **Material Classification/Comparison API (`/materials/`)**
+```http
+POST /materials/compare-revisions  # Revision comparison (designer+) ⭐ user tracked
+GET  /materials/comparison-history # Comparison history (user)
+GET  /materials/inventory-status   # Inventory status (purchaser+)
+POST /materials/confirm-purchase   # Confirm purchase (purchaser+) ⭐ user tracked
+GET  /materials/purchase-status    # Purchase status (purchaser+)
+```
+
+#### **Purchase Management API (`/purchase/`)**
+```http
+GET  /purchase/items/calculate # Calculate purchase items (purchaser+)
+POST /purchase/confirm         # Confirm purchase quantities (purchaser+) ⭐ user tracked
+POST /purchase/items/save      # Save purchase items (purchaser+)
+GET  /purchase/items           # Purchase item list (purchaser+)
+GET  /purchase/revision-diff   # Revision diff (purchaser+)
+POST /purchase/orders/create   # Create purchase order (purchaser+) ⭐ user tracked
+GET  /purchase/orders          # Purchase order list (purchaser+)
+```
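The quantity math behind `/purchase/items/calculate` is stated elsewhere in this document as `Math.ceil(bomQuantity / 5) * 5`. A minimal Python sketch of that rounding rule; applying the safety factor from `PurchaseItemMinimal` before rounding is an assumption, not a documented guarantee:

```python
import math

def calculate_purchase_qty(bom_quantity: float, safety_factor: float = 1.0, lot_size: int = 5) -> int:
    """Apply the safety factor, then round up to the next lot-size multiple."""
    required = bom_quantity * safety_factor
    return math.ceil(required / lot_size) * lot_size

# 23 pieces with a 10% safety margin -> 25.3 -> next multiple of 5
print(calculate_purchase_qty(23, 1.1))  # 30
```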
+
+#### **Dashboard API (`/dashboard/`)** ⭐ New (2025.01)
+```http
+GET /dashboard/stats             # Per-user statistics (auth required)
+GET /dashboard/activities        # User activity history (auth required)
+GET /dashboard/recent-activities # All recent activity (manager+)
+GET /dashboard/quick-actions     # Role-based quick actions (auth required)
+```
+
+**Actual response structure:**
+```json
+// GET /dashboard/stats - per-user statistics
+{
+  "success": true,
+  "user_role": "admin",
+  "stats": {
+    "total_projects": 45,
+    "active_users": 12,
+    "system_status": "OK",
+    "today_uploads": 8
+    // Note: quickActions, metrics, etc. are currently filled in with mock data on the frontend
+  }
+}
+
+// GET /dashboard/activities - user activity history
+{
+  "success": true,
+  "activities": [
+    {
+      "id": 1,
+      "activity_type": "FILE_UPLOAD",
+      "activity_description": "File uploaded: ProjectX_Rev0.xlsx",
+      "created_at": "2025-08-30T08:30:00Z",
+      "target_id": 123,
+      "target_type": "FILE"
+    }
+  ]
+}
+```
+
+#### **Tubing System API (`/tubing/`)**
+```http
+GET  /tubing/categories       # Tubing categories (user)
+GET  /tubing/manufacturers    # Manufacturer list (user)
+GET  /tubing/specifications   # Spec list (user)
+GET  /tubing/products         # Tubing product list (user)
+POST /tubing/products         # Create tubing product (designer+)
+POST /tubing/material-mapping # Create material-tubing mapping (designer+)
+GET  /tubing/material-mappings/{material_id} # Tubing mappings for a material (user)
+GET  /tubing/search           # Search tubing products (user)
+```
+
+---
+
+### ๐ **API ๊ฐ๋ฐ ๊ฐ์ด๋๋ผ์ธ** (2025.01 ์ ๊ท)
+
+#### **1. ์ API ๋ชจ๋ ์ถ๊ฐ ์ ์ฐจ**
+```python
+# 1. ๋ผ์ฐํฐ ํ์ผ ์์ฑ
+# backend/app/routers/new_module.py
+
+from fastapi import APIRouter, Depends, HTTPException
+from ..auth.middleware import get_current_user
+from ..services.activity_logger import log_activity_from_request
+
+router = APIRouter(prefix="/new-module", tags=["new-module"])
+
+@router.post("/action")
+async def new_action(
+ request: Request,
+ current_user: dict = Depends(get_current_user),
+ db: Session = Depends(get_db)
+):
+ # ์ฌ์ฉ์ ์ถ์ ํ์
+ log_activity_from_request(
+ db, request, current_user['username'],
+ "NEW_ACTION", "์ ์ก์
์คํ"
+ )
+ # ๋น์ฆ๋์ค ๋ก์ง...
+```
+
+```python
+# 2. main.py์ ๋ผ์ฐํฐ ๋ฑ๋ก
+try:
+ from .routers import new_module
+ app.include_router(new_module.router, tags=["new-module"])
+except ImportError:
+ logger.warning("new_module ๋ผ์ฐํฐ๋ฅผ ์ฐพ์ ์ ์์ต๋๋ค")
+```
+
+```markdown
+# 3. Update RULES.md (this document)
+#### **New Module API (`/new-module/`)**
+GET  /new-module/list   # List (user)
+POST /new-module/action # Execute action (permission) ⭐ user tracked
+```
+
+#### **2. Standard API response formats**
+```json
+// Success response
+{
+  "success": true,
+  "message": "Operation completed",
+  "data": { ... },
+  "timestamp": "2025-01-XX 12:00:00"
+}
+
+// Error response
+{
+  "success": false,
+  "error": "Error message",
+  "error_code": "ERROR_CODE",
+  "detail": "Detailed error information"
+}
+```
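A minimal sketch of helpers that produce the standard shapes above; the function names are illustrative and not part of the codebase:

```python
from datetime import datetime

def success_response(message: str, data: dict) -> dict:
    # Mirrors the documented success shape
    return {
        "success": True,
        "message": message,
        "data": data,
        "timestamp": datetime.now().strftime("%Y-%m-%d %H:%M:%S"),
    }

def error_response(error: str, error_code: str, detail: str = "") -> dict:
    # Mirrors the documented error shape
    return {
        "success": False,
        "error": error,
        "error_code": error_code,
        "detail": detail,
    }

resp = success_response("Upload complete", {"file_id": 123})
print(resp["success"], resp["data"]["file_id"])  # True 123
```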
+
+#### **2-1. Rules for documenting actual response structures** ⭐ Important
+- **Every API must have its actual response structure recorded in RULES.md**
+- **Include JSON examples the frontend can reference during development**
+- **Record field names, data types, and nesting exactly**
+- **Call out any mock data explicitly in comments**
+- **Update the documentation immediately when an API changes**
+
+**Documentation example:**
+```json
+// GET /jobs/ - project list (actual response)
+{
+  "success": true,
+  "total_count": 2,
+  "jobs": [
+    {
+      "job_no": "J24-001",
+      "project_name": "Project name",
+      "status": "In progress",
+      "client_name": "Client name"
+      // ... list every field
+    }
+  ]
+}
+```
+
+#### **3. Permission level definitions**
+- **Public**: no authentication required
+- **User**: any logged-in user
+- **Designer+**: designer, manager, admin
+- **Purchaser+**: purchaser, manager, admin
+- **Manager+**: manager, admin
+- **Admin**: admin only
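The levels above can be expressed as role sets. A sketch; the exact membership of the "user" level (whether it includes viewer) is an assumption:

```python
# Role sets corresponding to the documented permission levels
PERMISSION_LEVELS = {
    "public":     None,  # no auth required
    "user":       {"viewer", "designer", "purchaser", "manager", "admin"},
    "designer+":  {"designer", "manager", "admin"},
    "purchaser+": {"purchaser", "manager", "admin"},
    "manager+":   {"manager", "admin"},
    "admin":      {"admin"},
}

def has_permission(role: str, level: str) -> bool:
    allowed = PERMISSION_LEVELS[level]
    return True if allowed is None else role in allowed

print(has_permission("designer", "designer+"))  # True
print(has_permission("purchaser", "manager+"))  # False
```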
+
+#### **4. APIs that require user tracking** ⭐
+An activity log entry is mandatory for:
+- File upload/delete
+- Project create/update/delete
+- Purchase confirmation/order creation
+- Material classification/verification
+- System settings changes
+
+#### **5. API versioning**
+```http
+# Current version (default)
+GET /files/upload
+
+# New version (backward compatible)
+GET /v2/files/upload
+
+# Header-based versioning
+GET /files/upload
+Accept: application/vnd.tkmp.v2+json
+```
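For the header-based variant, the version has to be parsed out of `Accept`. A minimal sketch; the media-type pattern follows the example above:

```python
import re

def api_version(accept_header: str, default: int = 1) -> int:
    """Extract the version from 'application/vnd.tkmp.vN+json'; fall back to the default."""
    m = re.search(r"application/vnd\.tkmp\.v(\d+)\+json", accept_header)
    return int(m.group(1)) if m else default

print(api_version("application/vnd.tkmp.v2+json"))  # 2
print(api_version("application/json"))              # 1
```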
+
+#### **6. Performance considerations**
+- **Pagination**: list APIs support limit/offset
+- **Filtering**: filter conditions via query parameters
+- **Caching**: frequently read data is cached in Redis
+- **Async processing**: large file processing runs as background jobs
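A minimal sketch of the limit/offset contract used by the list APIs; the response keys are illustrative:

```python
def paginate(items, limit=50, offset=0):
    """limit/offset slicing as used by the list APIs."""
    return {
        "total_count": len(items),
        "items": items[offset:offset + limit],
        "limit": limit,
        "offset": offset,
    }

rows = [{"id": i} for i in range(120)]
page = paginate(rows, limit=50, offset=100)
print(page["total_count"], len(page["items"]))  # 120 20
```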
+
+---
+
+### **API Change History** (2025.01)
+
+#### **v2.2.0 (2025.09.05)** ⭐ Latest
+- ✅ **Cleanup**: API endpoints standardized and consolidated
+- ✅ **Documentation**: full API map updated (based on the actual implementation)
+- ✅ **Improvement**: frontend API calls standardized (use the `fetchMaterials` wrapper)
+- ✅ **Fix**: `/files/materials` → `/files/materials-v2` migration complete
+- ✅ **Added**: API usage guidelines and confusion-prevention rules
+- ✅ **Added**: tubing system API documentation
+
+#### **v2.1.1 (2025.08.30)**
+- ✅ **Documentation**: actual response structure of the `/jobs/` API recorded
+- ✅ **Documentation**: actual response structure of the `/dashboard/` API recorded
+- ✅ **Improvement**: frontend/backend response-structure mismatches resolved
+- ✅ **Added**: rules and guidelines for documenting actual response structures
+
+#### **v2.1.0 (2025.01.XX)**
+- ✅ **Added**: `/dashboard/` API module (per-user dashboard)
+- ✅ **Improvement**: user tracking added to all upload/update APIs
+- ✅ **Changed**: `/files/upload` - the `uploaded_by` field is now required
+- ⚠️ **Deprecation notice**: `/old-endpoint` (to be removed in v3.0)
+
+#### **v2.0.0 (2025.01.XX)**
+- ✅ **Added**: authentication system (`/auth/`) fully implemented
+- ✅ **Added**: user activity log system
+- ✅ **Changed**: JWT token authentication applied to all APIs
+- **Migration**: existing API calls now need an Authorization header
+
+---
+
+### **Developer Checklist**
+
+Confirm the following when building a new API:
+
+- [ ] **Documentation**: added to the RULES.md API map
+- [ ] **Auth**: appropriate permission level configured
+- [ ] **Tracking**: activity log recorded for important actions
+- [ ] **Validation**: input data validated (Pydantic)
+- [ ] **Errors**: standard error response format observed
+- [ ] **Tests**: unit/integration tests written
+- [ ] **Logging**: recorded at the appropriate log level
+- [ ] **Performance**: large data handling considered
+
### Data Flow Diagram
```mermaid
@@ -392,11 +760,74 @@ const totalLength = quantity * unitLength; // total length = quantity × unit length
material_hash = hashlib.md5(f"{description}|{size_spec}|{material_grade}".encode()).hexdigest()
```
-### 4. Revision Comparison Logic
+### 4. Revision Comparison Logic (new, 2025.01 ⭐)
```python
# Previous-revision auto-detection: numeric comparison
current_rev_num = int(current_revision.replace("Rev.", ""))
# Order: Rev.0 → Rev.1 → Rev.2
+
+# Material hashing rule (per RULES)
+material_hash = hashlib.md5(f"{description}|{size}|{material}".encode()).hexdigest()
+
+# Revision comparison workflow
+if revision != "Rev.0":  # only for revision uploads
+    revision_comparison = get_revision_comparison(db, job_no, revision, materials_data)
+
+    if revision_comparison.get("has_previous_confirmation"):
+        # Unchanged: reuse the previous classification result (confidence = 1.0)
+        # Changed + new: reclassification required
+        materials_to_classify = changed_materials + new_materials
+    else:
+        # No previous confirmation: classify everything
+        materials_to_classify = all_materials
+```
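The hash-based reuse above can be sketched as a pure function. Here "changed" is defined as a quantity difference, which is an assumption about the real comparison criteria:

```python
import hashlib

def material_hash(description: str, size: str, material: str) -> str:
    # Same hashing rule as documented above
    return hashlib.md5(f"{description}|{size}|{material}".encode()).hexdigest()

def diff_revisions(previous: list, current: list) -> dict:
    prev = {material_hash(m["description"], m["size"], m["material"]): m for m in previous}
    curr = {material_hash(m["description"], m["size"], m["material"]): m for m in current}
    unchanged = [curr[h] for h in curr if h in prev and curr[h]["qty"] == prev[h]["qty"]]
    changed = [curr[h] for h in curr if h in prev and curr[h]["qty"] != prev[h]["qty"]]
    new = [curr[h] for h in curr if h not in prev]
    return {"unchanged": unchanged, "changed": changed, "new": new}

rev0 = [{"description": "PIPE A106-B", "size": "2\"", "material": "CS", "qty": 10}]
rev1 = [
    {"description": "PIPE A106-B", "size": "2\"", "material": "CS", "qty": 12},  # changed
    {"description": "ELBOW 90D",   "size": "2\"", "material": "CS", "qty": 4},   # new
]
result = diff_revisions(rev0, rev1)
print(len(result["changed"]), len(result["new"]))  # 1 1
```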
+
+### 5. Purchase Quantity Confirmation Workflow (new, 2025.01 ⭐)
+```python
+# Confirmation data storage structure:
+# purchase_confirmations (master) -> confirmed_purchase_items (detail)
+
+# File status update on confirmation
+files.purchase_confirmed = TRUE
+files.confirmed_at = timestamp
+files.confirmed_by = username
+
+# Optimization on revision upload:
+# - previous confirmation exists: classify only the changed materials (faster)
+# - no previous confirmation: classify all materials (original behavior)
+```
+
+### 6. Large Data Handling Rules (new, 2025.01 ⭐)
+```python
+# Preventing 413 errors: minimize the request payload
+# ❌ Sending full rows (payload too large)
+purchase_items: List[dict]  # every field included
+
+# ✅ Sending only the required fields (minimized payload)
+class PurchaseItemMinimal(BaseModel):
+    item_code: str
+    category: str
+    specification: str
+    size: str = ""
+    material: str = ""
+    bom_quantity: float
+    calculated_qty: float
+    unit: str = "EA"
+    safety_factor: float = 1.0
+
+# Server-side request size limit
+app.add_middleware(RequestSizeLimitMiddleware, max_request_size=100 * 1024 * 1024)  # 100MB
+```
+
+Nginx proxy settings (important!):
+```nginx
+server {
+    client_max_body_size 100M;          # global setting
+
+    location /api/ {
+        proxy_pass http://backend:8000/;
+        client_max_body_size 100M;      # per-location setting
+        proxy_request_buffering off;    # optimize large requests
+    }
+}
+```
---
@@ -581,32 +1012,108 @@ navigate(`/bom-status?job_no=${jobNo}`);
navigate(`/material-comparison?job_no=${jobNo}&revision=${revision}`);
```
+### 4. Project-Centric Workflow (new, 2025.01) ⭐
+
+#### **Basic principles**
+- **Project first**: users select a project first, then pick a task
+- **Context preserved**: the selected project's info carries through every subsequent page
+- **Permission-based menus**: only the tasks the user's role allows are shown
+
+#### **Workflow structure**
+```
+Main dashboard → select a project → project workspace → do the work
+```
+
+#### **Key components**
+
+**1. ProjectSelector (project picker)**
+- Dropdown-style project selection UI
+- Search support (project name, job number)
+- Progress and status display
+- Highlights the selected project's info
+
+**2. ProjectWorkspacePage (project workspace)**
+- Per-project dashboard
+- Permission-based task menu (card layout)
+- Project statistics and recent activity
+- Quick-action buttons
+
+**3. Task menus by role**
+```javascript
+// Designer tasks
+// - BOM file upload
+// - BOM management
+// - Material classification verification
+
+// Purchaser tasks
+// - Purchase management
+// - Revision comparison
+// - Purchase confirmation
+
+// Shared tasks
+// - Project status
+// - Report generation
+```
+
+#### **UX improvements**
+- **Intuitive flow**: navigation matches the actual work process
+- **Context awareness**: project info is passed along automatically
+- **Efficiency**: removes redundant project-selection steps
+- **Visual feedback**: the selected project and its progress are visualized
---
## Development Workflow
-### 1. Server run commands
+### ⭐ 1. Running with Docker (recommended - same as production)
+```bash
+# Run from the TK-MP-Project root directory
+docker-compose up -d
+
+# Tail the logs
+docker-compose logs -f
+
+# Restart services (after code changes)
+docker-compose restart
+
+# Full rebuild (after Dockerfile changes)
+docker-compose down
+docker-compose up --build -d
+```
+
+**Docker access URLs:**
+- Frontend: http://localhost:13000
+- Backend API: http://localhost:18000
+- API docs: http://localhost:18000/docs
+- PostgreSQL: localhost:5432
+- Redis: localhost:6379
+- pgAdmin: http://localhost:5050
+
+### 2. Running locally (development/debugging only)
```bash
# Run the backend (terminal 1) - from the TK-MP-Project root
source venv/bin/activate  # activate the virtualenv (venv lives in the root)
cd backend
-python -m uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
+python -m uvicorn app.main:app --reload --host 0.0.0.0 --port 18000

# Run the frontend (terminal 2) - from the TK-MP-Project root
cd frontend
npm run dev  # NOT npm start!
```

-**Access URLs:**
-- Backend API: http://localhost:8000
-- API docs: http://localhost:8000/docs
-- Frontend: http://localhost:5173
+**Local development access URLs:**
+- Backend API: http://localhost:18000
+- API docs: http://localhost:18000/docs
+- Frontend: http://localhost:13000 (changed to avoid port conflicts)

-### 2. After backend changes
+### 3. After backend changes
```bash
-# Always run inside the virtualenv (user preference)
+# Docker environment (recommended)
+docker-compose restart backend
+
+# Local environment (for debugging)
cd backend
-python -m uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
+python -m uvicorn app.main:app --reload --host 0.0.0.0 --port 18000
```
### 3. After database schema changes
@@ -621,6 +1128,18 @@ python -m uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
์: "ํ์ดํ ๊ธธ์ด ๊ณ์ฐ ๋ฐ ์์
๋ด๋ณด๋ด๊ธฐ ๋ฒ๊ทธ ์์ "
```
+### ⚠️ Important: Choosing a Development Environment
+
+#### Docker environment (recommended)
+- **When to use**: day-to-day development, testing, production deployment
+- **Pros**: environment consistency; identical to the NAS deployment
+- **Cons**: debugging is slightly more complex
+
+#### Local environment (use selectively)
+- **When to use**: backend debugging, trying out new packages
+- **Pros**: fast debugging, IDE integration
+- **Cons**: environment drift can cause deployment problems
+
---
## Purchase Quantity Calculation Rules
@@ -747,25 +1266,48 @@ const purchaseQuantity = Math.ceil(bomQuantity / 5) * 5;
---
-## Synology DSM Deployment Guide
+## Synology NAS Deployment Guide ⭐

-### Service layout
-- **Frontend**: React + Vite (port 10173)
-- **Backend**: FastAPI (port 10080)
-- **Database**: PostgreSQL (port 15432)
-- **Cache**: Redis (port 16379)
+### Docker-Based Deployment (recommended)

-### Automated deployment (recommended)
+#### Service layout
+- **Frontend**: React + Nginx (port 13000)
+- **Backend**: FastAPI + Uvicorn (port 18000)
+- **Database**: PostgreSQL (port 5432)
+- **Cache**: Redis (port 6379)
+- **Admin tool**: pgAdmin4 (port 5050)
+
+#### Deployment commands
```bash
-./deploy-synology.sh 192.168.0.3
+# 1. Copy the project files to the NAS
+scp -r TK-MP-Project/ admin@[NAS_IP]:/volume1/docker/
+
+# 2. SSH into the NAS
+ssh admin@[NAS_IP]
+
+# 3. Change into the project directory
+cd /volume1/docker/TK-MP-Project/
+
+# 4. Start Docker Compose
+docker-compose up -d
+
+# 5. Check service status
+docker-compose ps
```

-### Access check
-- Frontend: http://192.168.0.3:10173
-- Backend API docs: http://192.168.0.3:10080/docs
+#### Access URLs (via the NAS IP)
+- **Frontend**: http://[NAS_IP]:13000
+- **Backend API**: http://[NAS_IP]:18000
+- **API docs**: http://[NAS_IP]:18000/docs
+- **pgAdmin**: http://[NAS_IP]:5050
+
+#### Automated deploy script (planned)
+```bash
+./deploy-synology.sh [NAS_IP]
+```

### Cautions
-1. **Port conflicts**: check that ports 10080 and 10173 are free on the Synology
+1. **Port conflicts**: check that ports 13000 and 18000 are free on the NAS
2. **Permissions**: Docker commands require `sudo`
3. **Firewall**: allow the ports in the DSM control panel
4. **Resources**: watch memory usage while building the backend
@@ -1142,4 +1684,324 @@ RUN apt-get update && apt-get install -y \
---
-**Last updated**: January 2025 (code restructuring and component separation complete)
+## **User Tracking and Owner Recording Guidelines** (new, 2025.01)
+
+### **Basic Principles**
+- **Every business action must record who performed it**
+- **Keep a traceable work history**
+- **Provide a personalized dashboard per user**
+- **Show role-differentiated information**
+
+### **Required Tracking Fields**
+
+#### **1. File management**
+```sql
+-- Recorded on file upload
+uploaded_by VARCHAR(100) NOT NULL,                -- uploading user
+upload_date TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
+updated_by VARCHAR(100),                          -- modifying user (on file update)
+updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
+```
+
+#### **2. Project management**
+```sql
+-- Recorded on project create/update
+created_by VARCHAR(100) NOT NULL,                 -- project creator
+created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
+updated_by VARCHAR(100),                          -- last modifier
+updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
+assigned_to VARCHAR(100),                         -- project owner
+```
+
+#### **3. Material management**
+```sql
+-- Recorded on material classification/verification
+classified_by VARCHAR(100),                       -- classification owner
+classified_at TIMESTAMP,
+verified_by VARCHAR(100),                         -- verification owner
+verified_at TIMESTAMP,
+updated_by VARCHAR(100),                          -- modifier
+updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
+```
+
+#### **4. Purchase management**
+```sql
+-- Recorded on purchase confirmation/order
+confirmed_by VARCHAR(100) NOT NULL,               -- purchase confirmer
+confirmed_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
+ordered_by VARCHAR(100),                          -- ordering owner
+ordered_at TIMESTAMP,
+approved_by VARCHAR(100),                         -- approver (high-value purchases)
+approved_at TIMESTAMP
+```
+### **Role-Based Access Control**
+
+#### **Administrator (admin)**
+- View/edit all projects
+- Manage users and permissions
+- System settings and backup management
+- View the full activity log
+
+#### **Project manager (manager)**
+- Full management of assigned projects
+- Assign team tasks and monitor progress
+- Purchase approval authority (up to a set amount)
+- Per-project reports
+
+#### **Designer (designer)**
+- Upload/edit BOMs for assigned projects
+- Classify and verify materials
+- Manage revisions
+- Draft purchase requests
+
+#### **Purchaser (purchaser)**
+- View purchase items and place orders
+- Manage suppliers
+- Track purchase status
+- Manage goods receipt
+
+#### **Read-only (viewer)**
+- View assigned projects only
+- Download reports
+- Check progress
+### **Personalized Dashboard Layout**
+
+#### **1. Personalized banner system**
+```javascript
+// Tailored information per user
+const personalizedBanner = {
+  admin: {
+    title: "System Administrator",
+    metrics: ["Total projects", "Active users", "System status"],
+    quickActions: ["User management", "System settings", "Backup management"]
+  },
+  manager: {
+    title: "Project Manager",
+    metrics: ["My projects", "Team progress", "Pending approvals"],
+    quickActions: ["Create project", "Team management", "Progress"]
+  },
+  designer: {
+    title: "Designer",
+    metrics: ["My BOM files", "Classification rate", "Pending verification"],
+    quickActions: ["Upload BOM", "Classify materials", "Manage revisions"]
+  },
+  purchaser: {
+    title: "Purchaser",
+    metrics: ["Purchase requests", "Orders placed", "Pending receipt"],
+    quickActions: ["Confirm purchase", "Manage orders", "Suppliers"]
+  }
+};
+```
+
+#### **2. Activity history tracking**
+```sql
+-- User activity log table
+CREATE TABLE user_activity_logs (
+    id SERIAL PRIMARY KEY,
+    user_id INTEGER REFERENCES users(user_id),
+    username VARCHAR(100) NOT NULL,
+    activity_type VARCHAR(50) NOT NULL,  -- 'FILE_UPLOAD', 'PROJECT_CREATE', 'PURCHASE_CONFIRM', ...
+    activity_description TEXT,           -- detailed activity description
+    target_id INTEGER,                   -- target ID (file, project, ...)
+    target_type VARCHAR(50),             -- 'FILE', 'PROJECT', 'MATERIAL', ...
+    ip_address VARCHAR(45),
+    user_agent TEXT,
+    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
+);
+```
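A runnable sketch of writing to this table, using SQLite in place of PostgreSQL (`SERIAL` becomes `AUTOINCREMENT`); the helper mirrors the `log_activity` idea but its exact signature is illustrative:

```python
import sqlite3
from datetime import datetime, timezone

# SQLite stand-in for the PostgreSQL table above
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE user_activity_logs (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        username TEXT NOT NULL,
        activity_type TEXT NOT NULL,
        activity_description TEXT,
        target_id INTEGER,
        target_type TEXT,
        created_at TEXT
    )
""")

def log_activity(db, username, activity_type, description, target_id=None, target_type=None):
    db.execute(
        "INSERT INTO user_activity_logs "
        "(username, activity_type, activity_description, target_id, target_type, created_at) "
        "VALUES (?, ?, ?, ?, ?, ?)",
        (username, activity_type, description, target_id, target_type,
         datetime.now(timezone.utc).isoformat()),
    )

log_activity(conn, "designer1", "FILE_UPLOAD", "File uploaded: ProjectX_Rev0.xlsx", 123, "FILE")
row = conn.execute("SELECT username, activity_type FROM user_activity_logs").fetchone()
print(row)  # ('designer1', 'FILE_UPLOAD')
```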
+
+#### **3. Personal work status**
+- **My uploads**: recently uploaded BOM files
+- **My projects**: progress of projects assigned to me
+- **My queue**: work waiting for classification/verification/approval
+- **Recent activity**: summary of the last 7 days
+### **Implementation Priority**
+
+#### **Phase 1: basic user tracking** (1 week)
+1. Add owner fields to the existing tables
+2. Record user info on file upload
+3. Build the basic activity log system
+
+#### **Phase 2: personalized dashboard** (2 weeks)
+1. Role-based personalized banner
+2. Personal work status page
+3. Activity history view
+
+#### **Phase 3: advanced features** (3 weeks)
+1. Fine-grained permission management
+2. Team/department dashboards
+3. Task assignment and notification system
+
+### ⚠️ **Cautions**
+- **Privacy**: use activity logs for work purposes only
+- **Retention**: keep activity logs for at most 1 year, then delete automatically
+- **Access**: personal activity history is visible only to the user and administrators
+- **Audit trail**: keep a separate audit log for critical actions (purchase confirmation, project deletion, ...)
+---
+
+## **Pre-Production Security Checklist** (new, 2025.01)
+
+> ⚠️ **Important**: the current settings are for the **test environment**. The items below must be fixed before deploying a real service.
+
+### **Critical Security Items (required before deployment)**
+
+#### **1. Move the JWT secret key into an environment variable**
+```bash
+# ❌ Current (test only)
+SECRET_KEY = "test-secret-key"
+
+# ✅ Change before deployment
+JWT_SECRET_KEY=your-super-secure-random-key-here  # add to the .env file
+```
+
+#### **2. Secure the database password**
+```yaml
+# ❌ Current (test only)
+POSTGRES_PASSWORD: tkmp_password_2025
+
+# ✅ Change before deployment
+POSTGRES_PASSWORD: ${DB_PASSWORD}  # moved into an environment variable
+```
+
+#### **3. CORS domain settings**
+```python
+# ❌ Current (test only)
+"production": [
+    "https://your-domain.com",
+    "https://api.your-domain.com"
+]
+
+# ✅ Change before deployment
+"production": [
+    "https://actual-domain.com",
+    "https://api.actual-domain.com"
+]
+```
+
+#### **4. Change the default admin account**
+```sql
+-- ❌ Current (test only)
+INSERT INTO users (username, password) VALUES ('admin', 'admin123');
+
+-- ✅ Before deployment:
+-- switch to a strong password and delete the test accounts
+```
+
+### **Pre-Deployment Security Checklist**
+
+- [ ] **Environment variables**: all secrets moved into .env files
+- [ ] **HTTPS**: SSL certificate installed, HTTP → HTTPS redirect
+- [ ] **Firewall**: only the required ports open (80, 443, SSH)
+- [ ] **Database access**: external access blocked; reachable only from the application
+- [ ] **Log file security**: no sensitive data logged; log file permissions set
+- [ ] **Backup strategy**: regular backups and restore tests
+- [ ] **Monitoring**: system health and security event monitoring
+- [ ] **Update plan**: schedule for security patches and dependency updates
+
+### **Per-Environment Settings Guide**
+
+#### **Development (current)**
+```bash
+ENVIRONMENT=development
+DEBUG=true
+CORS_ORIGINS=http://localhost:3000,http://localhost:13000
+```
+
+#### **Staging**
+```bash
+ENVIRONMENT=staging
+DEBUG=false
+CORS_ORIGINS=https://staging.your-domain.com
+```
+
+#### **Production**
+```bash
+ENVIRONMENT=production
+DEBUG=false
+CORS_ORIGINS=https://your-domain.com
+JWT_SECRET_KEY=strong-random-key
+DB_PASSWORD=strong-database-password
+```
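A small sketch that checks an env file for the production keys listed above; this is a hand-rolled parser for illustration, whereas a real deployment would use a library such as python-dotenv:

```python
REQUIRED_PROD_KEYS = {"ENVIRONMENT", "DEBUG", "CORS_ORIGINS", "JWT_SECRET_KEY", "DB_PASSWORD"}

def parse_env(text: str) -> dict:
    """Parse simple KEY=VALUE lines, skipping blanks and comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

def missing_prod_keys(env: dict) -> set:
    return REQUIRED_PROD_KEYS - env.keys()

sample = "ENVIRONMENT=production\nDEBUG=false\nCORS_ORIGINS=https://your-domain.com\n"
# Reports JWT_SECRET_KEY and DB_PASSWORD as missing for this sample
print(missing_prod_keys(parse_env(sample)))
```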
+
+---
+
+## **API Cleanup Summary** (completed 2025.09.05)
+
+### ✅ **Resolved Issues**
+1. **404 errors fixed**: migrated `/files/materials` → `/files/materials-v2`
+2. **API calls standardized**: direct calls → the `fetchMaterials()` wrapper
+3. **Confusion prevented**: clear API usage guidelines established
+4. **Documentation completed**: every implemented API endpoint catalogued
+
+### **Standardized Usage**
+```javascript
+// ✅ Recommended
+import { fetchMaterials, fetchFiles, fetchJobs } from '../api';
+
+// Materials by file
+const byFile = await fetchMaterials({ file_id: 123 });
+
+// Materials by project
+const byJob = await fetchMaterials({ job_no: 'J24-001' });
+
+// Materials by revision
+const byRevision = await fetchMaterials({
+  job_no: 'J24-001',
+  revision: 'Rev.1'
+});
+```
+
+### **Key Rules**
+- **All material API calls go through `fetchMaterials()`**
+- **No direct API calls** (except in special cases)
+- **Update RULES.md immediately when adding an API**
+- **Consider backward compatibility when changing an API**
+
+---
+
+## ๐ ์์ฌ ๋ถ๋ฅ ๊ท์น
+
+### ํต์ฌ ๋ถ๋ฅ ์์น
+
+#### 1. Nipple (NIPPLE) Special Rules ⚠️
+- **Classification**: classified by the pipe classifier (pipe_classifier), but the **category is recorded as FITTING**
+- **Rationale**: a nipple has the same material/spec as pipe but is treated as a fitting by usage
+- **Length-based grouping**: items with the same spec but different lengths are kept as separate entries
+  - e.g. `NIPPLE 1" 75mm` vs `NIPPLE 1" 100mm`
+- **Total length**: individual nipple length × quantity, summed, gives the actual total length
+- **End preparation**: as with pipe, end-prep info (PBE, BBE, POE, etc.) is stored separately
+- **Grouping key**: `clean_description|size_spec|material_grade|length_mm`
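The nipple grouping rule above can be sketched in Python. The field names (`clean_description`, `size_spec`, `material_grade`, `length_mm`, `quantity`) are assumptions inferred from the key format, not the project's actual schema:

```python
from collections import defaultdict

KEY_FIELDS = ("clean_description", "size_spec", "material_grade", "length_mm")

def nipple_group_key(item: dict) -> str:
    # Same spec but different length -> different key -> separate line item
    return "|".join(str(item.get(f, "")) for f in KEY_FIELDS)

def group_nipples(items: list) -> dict:
    groups = defaultdict(lambda: {"quantity": 0, "total_length_mm": 0.0})
    for item in items:
        group = groups[nipple_group_key(item)]
        group["quantity"] += item.get("quantity", 0)
        # Actual total length = per-piece length x quantity
        group["total_length_mm"] += float(item.get("length_mm") or 0) * item.get("quantity", 0)
    return dict(groups)
```

Because `length_mm` is part of the key, a 75 mm and a 100 mm nipple of the same spec never merge.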
+
+#### 2. Pipe (PIPE) Classification Rules
+- **Grouping key**: `clean_description|size_spec|material_grade`
+- **End prep excluded**: purchasing-oriented grouping excludes end-prep info such as BBE, POE, and PBE
+- **Per-item info preserved**: each pipe's end-prep info is stored separately in the `pipe_end_preparations` table
+- **Total length**: individual lengths of same-spec pipes are summed
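By contrast, the purchasing key for pipe drops both the length and any end-prep code, so variants that differ only in end preparation group together. A hedged sketch (the regex and field names are illustrative assumptions):

```python
import re
from collections import defaultdict

END_PREP_RE = re.compile(r"\b(PBE|BBE|POE|BOE|TOE)\b", re.IGNORECASE)

def pipe_purchase_key(item: dict) -> str:
    # Strip end-prep codes so BBE/POE/PBE variants fall into one purchasing group
    desc = END_PREP_RE.sub("", str(item.get("clean_description", ""))).strip()
    return "|".join([desc, str(item.get("size_spec", "")), str(item.get("material_grade", ""))])

def pipe_total_lengths(items: list) -> dict:
    # Sum individual lengths of same-spec pipes
    totals = defaultdict(float)
    for item in items:
        totals[pipe_purchase_key(item)] += float(item.get("length_mm") or 0)
    return dict(totals)
```

Per-item end-prep detail would still be persisted separately (the `pipe_end_preparations` table in this project), so nothing is lost by excluding it from the key.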
+
+#### 3. Other Fitting (FITTING) Classification Rules
+- **General fittings**: aggregated by quantity (ELBOW, TEE, REDUCER, etc.)
+- **No length info**: general fittings other than nipples need no length-based grouping
+
+### Classification Priority
+1. **PIPE**: pipe classifier applied first
+2. **FITTING**: fitting classifier applied (includes nipples)
+3. **VALVE**: valve classifier applied
+4. **FLANGE**: flange classifier applied
+5. **BOLT**: bolt classifier applied
+6. **GASKET**: gasket classifier applied
+7. **INSTRUMENT**: instrument classifier applied
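The priority order amounts to a first-match cascade with a confidence threshold. A minimal sketch (the 0.5 threshold mirrors the upload logic in the deleted `files.py`, but the simplified signature and fallback behavior here are assumptions):

```python
CONFIDENCE_THRESHOLD = 0.5

def classify_material_cascade(description: str, size_spec: str, classifiers: list) -> dict:
    """Try classifiers in priority order; return the first confident result."""
    result = {"category": "UNKNOWN", "overall_confidence": 0.0}
    for classify in classifiers:  # e.g. [classify_pipe, classify_fitting, ...]
        candidate = classify(description, size_spec)
        if candidate.get("overall_confidence", 0.0) >= CONFIDENCE_THRESHOLD:
            return candidate
        result = candidate  # keep the last low-confidence guess as a fallback
    return result
```

Returning the last low-confidence candidate (rather than always UNKNOWN) is one possible design choice; the real pipeline may differ.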
+
+### End Preparation Code Definitions
+- **PBE**: Plain Both Ends (no bevel on either end) - default
+- **BBE**: Both Ends Beveled
+- **POE**: Plain One End
+- **BOE**: Beveled One End
+- **TOE**: Threaded One End
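These codes can be picked out of a description with a simple word-boundary match, falling back to the documented PBE default. A sketch (the helper name is hypothetical):

```python
import re

# End-preparation codes defined above; PBE is the default when none is present.
END_PREP_CODES = {
    "PBE": "Plain Both Ends",
    "BBE": "Both Ends Beveled",
    "POE": "Plain One End",
    "BOE": "Beveled One End",
    "TOE": "Threaded One End",
}

def extract_end_prep(description: str) -> str:
    # Word boundaries avoid false hits inside longer tokens
    match = re.search(r"\b(PBE|BBE|POE|BOE|TOE)\b", description.upper())
    return match.group(1) if match else "PBE"
```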
+
+---
+
+**Last updated**: September 2025 (material classification rules and API cleanup completed)
diff --git a/backend/app.py b/backend/app.py
deleted file mode 100644
index ee620fa..0000000
--- a/backend/app.py
+++ /dev/null
@@ -1,166 +0,0 @@
-from flask import Flask, request, jsonify
-import psycopg2
-from contextlib import contextmanager
-
-app = Flask(__name__)
-
-@contextmanager
-def get_db_connection():
- conn = psycopg2.connect(
- host="localhost",
- database="tkmp_db",
- user="tkmp_user",
- password="tkmp2024!",
- port="5432"
- )
- try:
- yield conn
- finally:
- conn.close()
-
-@app.route('/')
-def home():
-    return {"message": "API is running"}
-
-@app.route('/api/materials')
-def get_materials():
- job_number = request.args.get('job_number')
-
- if not job_number:
-        return {"error": "job_number is required"}, 400
-
- try:
- with get_db_connection() as conn:
- cur = conn.cursor()
-
- cur.execute("""
- SELECT id, job_number, item_number, description,
- category, quantity, unit, created_at
- FROM materials
- WHERE job_number = %s
- ORDER BY item_number
- """, (job_number,))
-
- rows = cur.fetchall()
-
- materials = []
- for r in rows:
- item = {
- 'id': r[0],
- 'job_number': r[1],
- 'item_number': r[2],
- 'description': r[3],
- 'category': r[4],
- 'quantity': r[5],
- 'unit': r[6],
- 'created_at': str(r[7]) if r[7] else None
- }
- materials.append(item)
-
- return {
- 'success': True,
- 'data': materials,
- 'count': len(materials)
- }
-
- except Exception as e:
-        return {"error": f"DB error: {str(e)}"}, 500
-
-if __name__ == '__main__':
-    print("Server starting: http://localhost:5000")
- app.run(debug=True, port=5000)
-# Fixed get_materials API (uses the correct column names)
-@app.route('/api/materials-fixed', methods=['GET'])
-def get_materials_fixed():
- """์ฌ๋ฐ๋ฅธ ์ปฌ๋ผ๋ช
์ ์ฌ์ฉํ ์์ฌ ์กฐํ API"""
- try:
- file_id = request.args.get('file_id')
-
- if not file_id:
- return jsonify({
- 'success': False,
- 'error': 'file_id parameter is required'
- }), 400
-
- with get_db_connection() as conn:
- cur = conn.cursor()
-
- cur.execute("""
- SELECT
- id, file_id, line_number, original_description,
- classified_category, classified_subcategory,
- quantity, unit, created_at
- FROM materials
- WHERE file_id = %s
- ORDER BY line_number
- """, (file_id,))
-
- materials = []
- for item in cur.fetchall():
- material = {
- 'id': item[0],
- 'file_id': item[1],
- 'line_number': item[2],
- 'original_description': item[3],
- 'classified_category': item[4],
- 'classified_subcategory': item[5],
- 'quantity': float(item[6]) if item[6] else 0,
- 'unit': item[7],
- 'created_at': item[8].isoformat() if item[8] else None
- }
- materials.append(material)
-
- return jsonify({
- 'success': True,
- 'data': materials,
- 'count': len(materials),
- 'file_id': file_id
- })
-
- except Exception as e:
- print(f"Error in get_materials_fixed: {e}")
- return jsonify({
- 'success': False,
- 'error': str(e)
- }), 500
-
-@app.get("/api/materials-test")
-def get_materials_test():
-    """Test-only material lookup API"""
-    # Flask does not inject query parameters from type hints (that is
-    # FastAPI behavior), so read file_id from the query string explicitly.
-    file_id = request.args.get('file_id', type=int)
- try:
- with get_db_connection() as conn:
- cur = conn.cursor()
-
- cur.execute("""
- SELECT
- id, file_id, line_number, original_description,
- classified_category, quantity, unit
- FROM materials
- WHERE file_id = %s
- ORDER BY line_number
- LIMIT 5
- """, (file_id,))
-
- rows = cur.fetchall()
-
- materials = []
- for r in rows:
- materials.append({
- 'id': r[0],
- 'file_id': r[1],
- 'line_number': r[2],
- 'description': r[3],
- 'category': r[4],
- 'quantity': float(r[5]) if r[5] else 0,
- 'unit': r[6]
- })
-
- return {
- 'success': True,
- 'data': materials,
- 'count': len(materials)
- }
-
- except Exception as e:
- return {'error': str(e)}
-
diff --git a/backend/app/api/file_management.py b/backend/app/api/file_management.py
deleted file mode 100644
index 25fc580..0000000
--- a/backend/app/api/file_management.py
+++ /dev/null
@@ -1,56 +0,0 @@
-"""
-File management API
-File-related endpoints separated out of main.py
-"""
-from fastapi import APIRouter, Depends
-from sqlalchemy import text
-from sqlalchemy.orm import Session
-from typing import Optional
-
-from ..database import get_db
-from ..utils.logger import get_logger
-from ..schemas import FileListResponse, FileDeleteResponse, FileInfo
-from ..services.file_service import get_file_service
-
-router = APIRouter()
-logger = get_logger(__name__)
-
-
-@router.get("/files", response_model=FileListResponse)
-async def get_files(
- job_no: Optional[str] = None,
- show_history: bool = False,
- use_cache: bool = True,
- db: Session = Depends(get_db)
-) -> FileListResponse:
- """ํ์ผ ๋ชฉ๋ก ์กฐํ (BOM๋ณ ๊ทธ๋ฃนํ)"""
- file_service = get_file_service(db)
-
-    # Call the service layer
- files, cache_hit = await file_service.get_files(job_no, show_history, use_cache)
-
- return FileListResponse(
- success=True,
-        message="File list retrieved" + (" (cached)" if cache_hit else ""),
- data=files,
- total_count=len(files),
- cache_hit=cache_hit
- )
-
-
-@router.delete("/files/{file_id}", response_model=FileDeleteResponse)
-async def delete_file(
- file_id: int,
- db: Session = Depends(get_db)
-) -> FileDeleteResponse:
- """ํ์ผ ์ญ์ """
- file_service = get_file_service(db)
-
-    # Call the service layer
- result = await file_service.delete_file(file_id)
-
- return FileDeleteResponse(
- success=result["success"],
- message=result["message"],
- deleted_file_id=result["deleted_file_id"]
- )
diff --git a/backend/app/api/files.py b/backend/app/api/files.py
deleted file mode 100644
index f1ce8fc..0000000
--- a/backend/app/api/files.py
+++ /dev/null
@@ -1,1180 +0,0 @@
-from fastapi import APIRouter, Depends, HTTPException, UploadFile, File, Form
-from sqlalchemy.orm import Session
-from sqlalchemy import text
-from typing import List, Optional
-import os
-import shutil
-from datetime import datetime
-import uuid
-import pandas as pd
-import re
-import json
-from pathlib import Path
-
-from ..database import get_db
-from app.services.material_classifier import classify_material
-from app.services.bolt_classifier import classify_bolt
-from app.services.flange_classifier import classify_flange
-from app.services.fitting_classifier import classify_fitting
-from app.services.gasket_classifier import classify_gasket
-from app.services.instrument_classifier import classify_instrument
-from app.services.pipe_classifier import classify_pipe
-from app.services.valve_classifier import classify_valve
-
-router = APIRouter()
-
-UPLOAD_DIR = Path("uploads")
-UPLOAD_DIR.mkdir(exist_ok=True)
-ALLOWED_EXTENSIONS = {".xlsx", ".xls", ".csv"}
-
-@router.get("/")
-async def get_files_info():
- return {
- "message": "ํ์ผ ๊ด๋ฆฌ API",
- "allowed_extensions": list(ALLOWED_EXTENSIONS),
- "upload_directory": str(UPLOAD_DIR)
- }
-
-@router.get("/test")
-async def test_endpoint():
- return {"status": "ํ์ผ API๊ฐ ์ ์ ์๋ํฉ๋๋ค!"}
-
-@router.post("/add-missing-columns")
-async def add_missing_columns(db: Session = Depends(get_db)):
- """๋๋ฝ๋ ์ปฌ๋ผ๋ค ์ถ๊ฐ"""
- try:
- db.execute(text("ALTER TABLE files ADD COLUMN IF NOT EXISTS parsed_count INTEGER DEFAULT 0"))
- db.execute(text("ALTER TABLE materials ADD COLUMN IF NOT EXISTS row_number INTEGER"))
- db.commit()
-
- return {
- "success": True,
- "message": "๋๋ฝ๋ ์ปฌ๋ผ๋ค์ด ์ถ๊ฐ๋์์ต๋๋ค",
- "added_columns": ["files.parsed_count", "materials.row_number"]
- }
- except Exception as e:
- db.rollback()
- return {"success": False, "error": f"์ปฌ๋ผ ์ถ๊ฐ ์คํจ: {str(e)}"}
-
-def validate_file_extension(filename: str) -> bool:
- return Path(filename).suffix.lower() in ALLOWED_EXTENSIONS
-
-def generate_unique_filename(original_filename: str) -> str:
- timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
- unique_id = str(uuid.uuid4())[:8]
- stem = Path(original_filename).stem
- suffix = Path(original_filename).suffix
- return f"{stem}_{timestamp}_{unique_id}{suffix}"
-
-def parse_dataframe(df):
- df = df.dropna(how='all')
-    # Keep the original column names (no lowercase conversion)
- df.columns = df.columns.str.strip()
-
- column_mapping = {
-        'description': ['description', 'item', 'material', '품명', '자재명'],
-        'quantity': ['qty', 'quantity', 'ea', '수량'],
-        'main_size': ['main_nom', 'nominal_diameter', 'nd', '주배관'],
-        'red_size': ['red_nom', 'reduced_diameter', '축소배관'],
-        'length': ['length', 'len', '길이'],
-        'weight': ['weight', 'wt', '중량'],
-        'dwg_name': ['dwg_name', 'drawing', '도면명'],
-        'line_num': ['line_num', 'line_number', '라인번호']
- }
-
- mapped_columns = {}
- for standard_col, possible_names in column_mapping.items():
- for possible_name in possible_names:
-            # Match column names case-insensitively
- for col in df.columns:
- if possible_name.lower() == col.lower():
- mapped_columns[standard_col] = col
- break
- if standard_col in mapped_columns:
- break
-
- print(f"์ฐพ์ ์ปฌ๋ผ ๋งคํ: {mapped_columns}")
-
- materials = []
- for index, row in df.iterrows():
- description = str(row.get(mapped_columns.get('description', ''), ''))
- quantity_raw = row.get(mapped_columns.get('quantity', ''), 0)
-
-        try:
-            quantity = float(quantity_raw) if pd.notna(quantity_raw) else 0
-        except (TypeError, ValueError):
-            quantity = 0
-
-        # Parse length info
- length_raw = row.get(mapped_columns.get('length', ''), None)
- length_value = None
- if pd.notna(length_raw) and length_raw != '':
-            try:
-                length_value = float(length_raw)
-            except (TypeError, ValueError):
-                length_value = None
-
- material_grade = ""
- if "ASTM" in description.upper():
- astm_match = re.search(r'ASTM\s+([A-Z0-9\s]+)', description.upper())
- if astm_match:
- material_grade = astm_match.group(0).strip()
-
- main_size = str(row.get(mapped_columns.get('main_size', ''), ''))
- red_size = str(row.get(mapped_columns.get('red_size', ''), ''))
-
- if main_size != 'nan' and red_size != 'nan' and red_size != '':
- size_spec = f"{main_size} x {red_size}"
- elif main_size != 'nan' and main_size != '':
- size_spec = main_size
- else:
- size_spec = ""
-
- if description and description not in ['nan', 'None', '']:
- materials.append({
- 'original_description': description,
- 'quantity': quantity,
- 'unit': "EA",
- 'size_spec': size_spec,
- 'material_grade': material_grade,
- 'length': length_value,
- 'line_number': index + 1,
- 'row_number': index + 1
- })
-
- return materials
-
-def parse_file_data(file_path):
- file_extension = Path(file_path).suffix.lower()
-
- try:
- if file_extension == ".csv":
- df = pd.read_csv(file_path, encoding='utf-8')
- elif file_extension in [".xlsx", ".xls"]:
- df = pd.read_excel(file_path, sheet_name=0)
- else:
-            raise HTTPException(status_code=400, detail="Unsupported file format")
-
- return parse_dataframe(df)
- except Exception as e:
-        raise HTTPException(status_code=400, detail=f"File parsing failed: {str(e)}")
-
-@router.post("/upload")
-async def upload_file(
- file: UploadFile = File(...),
- project_id: int = Form(...),
- revision: str = Form("Rev.0"),
- db: Session = Depends(get_db)
-):
- if not validate_file_extension(str(file.filename)):
- raise HTTPException(
- status_code=400,
- detail=f"์ง์ํ์ง ์๋ ํ์ผ ํ์์
๋๋ค. ํ์ฉ๋ ํ์ฅ์: {', '.join(ALLOWED_EXTENSIONS)}"
- )
-
- if file.size and file.size > 10 * 1024 * 1024:
-        raise HTTPException(status_code=400, detail="File size cannot exceed 10MB")
-
- unique_filename = generate_unique_filename(str(file.filename))
- file_path = UPLOAD_DIR / unique_filename
-
- try:
- with open(file_path, "wb") as buffer:
- shutil.copyfileobj(file.file, buffer)
- except Exception as e:
-        raise HTTPException(status_code=500, detail=f"Failed to save file: {str(e)}")
-
- try:
- materials_data = parse_file_data(str(file_path))
- parsed_count = len(materials_data)
-
-        # Save the file record
- file_insert_query = text("""
- INSERT INTO files (filename, original_filename, file_path, project_id, revision, description, file_size, parsed_count, is_active)
- VALUES (:filename, :original_filename, :file_path, :project_id, :revision, :description, :file_size, :parsed_count, :is_active)
- RETURNING id
- """)
-
- file_result = db.execute(file_insert_query, {
- "filename": unique_filename,
- "original_filename": file.filename,
- "file_path": str(file_path),
- "project_id": project_id,
- "revision": revision,
- "description": f"BOM ํ์ผ - {parsed_count}๊ฐ ์์ฌ",
- "file_size": file.size,
- "parsed_count": parsed_count,
- "is_active": True
- })
-
- file_id = file_result.fetchone()[0]
-
-        # Save material data (with classification)
- materials_inserted = 0
- for material_data in materials_data:
-            # Apply the material-type classifiers (PIPE, FITTING, VALVE, etc.)
- description = material_data["original_description"]
- size_spec = material_data["size_spec"]
-
-            # Try each classifier (with the correct parameters)
-            print(f"Classifying: {description}")
-
-            # Timeout and exception handling around classifier calls
- classification_result = None
- try:
-                # Pass the length parameter to the pipe classifier
- length_value = None
- if 'length' in material_data:
-                    try:
-                        length_value = float(material_data['length'])
-                    except (TypeError, ValueError):
-                        length_value = None
-                # Replace None with 0.0
- if length_value is None:
- length_value = 0.0
-
-                # Set a timeout (10 seconds)
- import signal
- def timeout_handler(signum, frame):
-                    raise TimeoutError("Classifier execution timed out")
-
- signal.signal(signal.SIGALRM, timeout_handler)
-                signal.alarm(10)  # 10-second timeout
-
- try:
- classification_result = classify_pipe("", description, size_spec, length_value)
- print(f"PIPE ๋ถ๋ฅ ๊ฒฐ๊ณผ: {classification_result.get('category', 'UNKNOWN')} (์ ๋ขฐ๋: {classification_result.get('overall_confidence', 0)})")
- finally:
-                    signal.alarm(0)  # clear the timeout
-
- if classification_result.get("overall_confidence", 0) < 0.5:
- signal.alarm(10)
- try:
- classification_result = classify_fitting("", description, size_spec)
- print(f"FITTING ๋ถ๋ฅ ๊ฒฐ๊ณผ: {classification_result.get('category', 'UNKNOWN')} (์ ๋ขฐ๋: {classification_result.get('overall_confidence', 0)})")
- finally:
- signal.alarm(0)
-
- if classification_result.get("overall_confidence", 0) < 0.5:
- signal.alarm(10)
- try:
- classification_result = classify_valve("", description, size_spec)
- print(f"VALVE ๋ถ๋ฅ ๊ฒฐ๊ณผ: {classification_result.get('category', 'UNKNOWN')} (์ ๋ขฐ๋: {classification_result.get('overall_confidence', 0)})")
- finally:
- signal.alarm(0)
-
- if classification_result.get("overall_confidence", 0) < 0.5:
- signal.alarm(10)
- try:
- classification_result = classify_flange("", description, size_spec)
- print(f"FLANGE ๋ถ๋ฅ ๊ฒฐ๊ณผ: {classification_result.get('category', 'UNKNOWN')} (์ ๋ขฐ๋: {classification_result.get('overall_confidence', 0)})")
- finally:
- signal.alarm(0)
-
- if classification_result.get("overall_confidence", 0) < 0.5:
- signal.alarm(10)
- try:
- classification_result = classify_bolt("", description, size_spec)
- print(f"BOLT ๋ถ๋ฅ ๊ฒฐ๊ณผ: {classification_result.get('category', 'UNKNOWN')} (์ ๋ขฐ๋: {classification_result.get('overall_confidence', 0)})")
- finally:
- signal.alarm(0)
-
- if classification_result.get("overall_confidence", 0) < 0.5:
- signal.alarm(10)
- try:
- classification_result = classify_gasket("", description, size_spec)
- print(f"GASKET ๋ถ๋ฅ ๊ฒฐ๊ณผ: {classification_result.get('category', 'UNKNOWN')} (์ ๋ขฐ๋: {classification_result.get('overall_confidence', 0)})")
- finally:
- signal.alarm(0)
-
- if classification_result.get("overall_confidence", 0) < 0.5:
- signal.alarm(10)
- try:
- classification_result = classify_instrument("", description, size_spec)
- print(f"INSTRUMENT ๋ถ๋ฅ ๊ฒฐ๊ณผ: {classification_result.get('category', 'UNKNOWN')} (์ ๋ขฐ๋: {classification_result.get('overall_confidence', 0)})")
- finally:
- signal.alarm(0)
-
- except (TimeoutError, Exception) as e:
- print(f"๋ถ๋ฅ๊ธฐ ์คํ ์ค ์ค๋ฅ ๋ฐ์: {e}")
- # ๊ธฐ๋ณธ ๋ถ๋ฅ ๊ฒฐ๊ณผ ์์ฑ
- classification_result = {
- "category": "UNKNOWN",
- "overall_confidence": 0.0,
- "reason": f"๋ถ๋ฅ๊ธฐ ์ค๋ฅ: {str(e)}"
- }
-
- print(f"์ต์ข
๋ถ๋ฅ ๊ฒฐ๊ณผ: {classification_result.get('category', 'UNKNOWN')}")
-
-            # Extract detail info from the classification result
- if classification_result.get('category') == 'PIPE':
- classification_details = classification_result
- elif classification_result.get('category') == 'FITTING':
- classification_details = classification_result
- elif classification_result.get('category') == 'VALVE':
- classification_details = classification_result
- else:
- classification_details = {}
-            # Serialize to JSON for storing in the DB
- classification_details = json.dumps(classification_details, ensure_ascii=False)
-
-            # Debug: inspect the data just before saving
-            print(f"=== Material[{materials_inserted + 1}] about to save ===")
-            print(f"Description: {material_data['original_description']}")
-            print(f"Category: {classification_result.get('category')}")
-            print(f"Confidence: {classification_result.get('overall_confidence', 0)}")
-            print(f"classification_details length: {len(classification_details)}")
-            print(f"classification_details sample: {classification_details[:200]}...")
-            print("=" * 50)
- material_insert_query = text("""
- INSERT INTO materials (
- file_id, original_description, quantity, unit, size_spec,
- material_grade, line_number, row_number, classified_category,
- classification_confidence, classification_details, is_verified, created_at
- )
- VALUES (
- :file_id, :original_description, :quantity, :unit, :size_spec,
- :material_grade, :line_number, :row_number, :classified_category,
- :classification_confidence, :classification_details, :is_verified, :created_at
- )
- """)
-
- db.execute(material_insert_query, {
- "file_id": file_id,
- "original_description": material_data["original_description"],
- "quantity": material_data["quantity"],
- "unit": material_data["unit"],
- "size_spec": material_data["size_spec"],
- "material_grade": material_data["material_grade"],
- "line_number": material_data["line_number"],
- "row_number": material_data["row_number"],
- "classified_category": classification_result.get("category", "UNKNOWN"),
- "classification_confidence": classification_result.get("overall_confidence", 0.0),
- "classification_details": classification_details,
- "is_verified": False,
- "created_at": datetime.now()
- })
-
-            # Save category-specific details to the detail tables
- category = classification_result.get('category')
- confidence = classification_result.get('overall_confidence', 0)
-
- if category == 'PIPE' and confidence >= 0.5:
- try:
-                    # Extract pipe detail info from the classification result
- pipe_info = classification_result
-
-                    # Get the length from cutting_dimensions
- cutting_dims = pipe_info.get('cutting_dimensions', {})
- length_mm = cutting_dims.get('length_mm')
-
-                    # Fall back to the source data's length if length_mm is missing
- if not length_mm and material_data.get('length'):
- length_mm = material_data['length']
-
- pipe_insert_query = text("""
- INSERT INTO pipe_details (
- material_id, file_id, outer_diameter, schedule,
- material_spec, manufacturing_method, length_mm
- )
- VALUES (
- (SELECT id FROM materials WHERE file_id = :file_id AND original_description = :description AND row_number = :row_number),
- :file_id, :outer_diameter, :schedule,
- :material_spec, :manufacturing_method, :length_mm
- )
- """)
-
- db.execute(pipe_insert_query, {
- "file_id": file_id,
- "description": material_data["original_description"],
- "row_number": material_data["row_number"],
- "outer_diameter": pipe_info.get('nominal_diameter', ''),
- "schedule": pipe_info.get('schedule', ''),
- "material_spec": pipe_info.get('material_spec', ''),
- "manufacturing_method": pipe_info.get('manufacturing_method', ''),
- "length_mm": length_mm,
-
- })
-
- print(f"PIPE ์์ธ์ ๋ณด ์ ์ฅ ์๋ฃ: {material_data['original_description']}")
-
- except Exception as e:
- print(f"PIPE ์์ธ์ ๋ณด ์ ์ฅ ์คํจ: {e}")
- # ์๋ฌ๊ฐ ๋ฐ์ํด๋ ์ ์ฒด ํ๋ก์ธ์ค๋ ๊ณ์ ์งํ
-
- elif category == 'FITTING' and confidence >= 0.5:
- try:
- fitting_info = classification_result
-
- fitting_insert_query = text("""
- INSERT INTO fitting_details (
- material_id, file_id, fitting_type, fitting_subtype,
- connection_method, connection_code, pressure_rating, max_pressure,
- manufacturing_method, material_standard, material_grade, material_type,
- main_size, reduced_size, classification_confidence, additional_info
- )
- VALUES (
- (SELECT id FROM materials WHERE file_id = :file_id AND original_description = :description AND row_number = :row_number),
- :file_id, :fitting_type, :fitting_subtype,
- :connection_method, :connection_code, :pressure_rating, :max_pressure,
- :manufacturing_method, :material_standard, :material_grade, :material_type,
- :main_size, :reduced_size, :classification_confidence, :additional_info
- )
- """)
-
- db.execute(fitting_insert_query, {
- "file_id": file_id,
- "description": material_data["original_description"],
- "row_number": material_data["row_number"],
- "fitting_type": fitting_info.get('fitting_type', {}).get('type', ''),
- "fitting_subtype": fitting_info.get('fitting_type', {}).get('subtype', ''),
- "connection_method": fitting_info.get('connection_method', {}).get('method', ''),
- "connection_code": fitting_info.get('connection_method', {}).get('matched_code', ''),
- "pressure_rating": fitting_info.get('pressure_rating', {}).get('rating', ''),
- "max_pressure": fitting_info.get('pressure_rating', {}).get('max_pressure', ''),
- "manufacturing_method": fitting_info.get('manufacturing', {}).get('method', ''),
- "material_standard": fitting_info.get('material', {}).get('standard', ''),
- "material_grade": fitting_info.get('material', {}).get('grade', ''),
- "material_type": fitting_info.get('material', {}).get('material_type', ''),
- "main_size": fitting_info.get('size_info', {}).get('main_size', ''),
- "reduced_size": fitting_info.get('size_info', {}).get('reduced_size', ''),
- "classification_confidence": confidence,
- "additional_info": json.dumps(fitting_info, ensure_ascii=False)
- })
-
- print(f"FITTING ์์ธ์ ๋ณด ์ ์ฅ ์๋ฃ: {material_data['original_description']}")
-
- except Exception as e:
- print(f"FITTING ์์ธ์ ๋ณด ์ ์ฅ ์คํจ: {e}")
-
- elif category == 'VALVE' and confidence >= 0.5:
- try:
- valve_info = classification_result
-
- valve_insert_query = text("""
- INSERT INTO valve_details (
- material_id, file_id, valve_type, valve_subtype, actuator_type,
- connection_method, pressure_rating, pressure_class,
- body_material, trim_material, size_inches,
- fire_safe, low_temp_service, classification_confidence, additional_info
- )
- VALUES (
- (SELECT id FROM materials WHERE file_id = :file_id AND original_description = :description AND row_number = :row_number),
- :file_id, :valve_type, :valve_subtype, :actuator_type,
- :connection_method, :pressure_rating, :pressure_class,
- :body_material, :trim_material, :size_inches,
- :fire_safe, :low_temp_service, :classification_confidence, :additional_info
- )
- """)
-
- db.execute(valve_insert_query, {
- "file_id": file_id,
- "description": material_data["original_description"],
- "row_number": material_data["row_number"],
- "valve_type": valve_info.get('valve_type', ''),
- "valve_subtype": valve_info.get('valve_subtype', ''),
- "actuator_type": valve_info.get('actuator_type', ''),
- "connection_method": valve_info.get('connection_method', ''),
- "pressure_rating": valve_info.get('pressure_rating', ''),
- "pressure_class": valve_info.get('pressure_class', ''),
- "body_material": valve_info.get('body_material', ''),
- "trim_material": valve_info.get('trim_material', ''),
- "size_inches": valve_info.get('size', ''),
- "fire_safe": valve_info.get('fire_safe', False),
- "low_temp_service": valve_info.get('low_temp_service', False),
- "classification_confidence": confidence,
- "additional_info": json.dumps(valve_info, ensure_ascii=False)
- })
-
- print(f"VALVE ์์ธ์ ๋ณด ์ ์ฅ ์๋ฃ: {material_data['original_description']}")
-
- except Exception as e:
- print(f"VALVE ์์ธ์ ๋ณด ์ ์ฅ ์คํจ: {e}")
-
- elif category == 'FLANGE' and confidence >= 0.5:
- try:
- flange_info = classification_result
-
- flange_insert_query = text("""
- INSERT INTO flange_details (
- material_id, file_id, flange_type, facing_type,
- pressure_rating, material_standard, material_grade,
- size_inches, classification_confidence, additional_info
- )
- VALUES (
- (SELECT id FROM materials WHERE file_id = :file_id AND original_description = :description AND row_number = :row_number),
- :file_id, :flange_type, :facing_type,
- :pressure_rating, :material_standard, :material_grade,
- :size_inches, :classification_confidence, :additional_info
- )
- """)
-
- db.execute(flange_insert_query, {
- "file_id": file_id,
- "description": material_data["original_description"],
- "row_number": material_data["row_number"],
- "flange_type": flange_info.get('flange_type', {}).get('type', ''),
- "facing_type": flange_info.get('face_finish', {}).get('finish', ''),
- "pressure_rating": flange_info.get('pressure_rating', {}).get('rating', ''),
- "material_standard": flange_info.get('material', {}).get('standard', ''),
- "material_grade": flange_info.get('material', {}).get('grade', ''),
- "size_inches": material_data.get('size_spec', ''),
- "classification_confidence": confidence,
- "additional_info": json.dumps(flange_info, ensure_ascii=False)
- })
-
- print(f"FLANGE ์์ธ์ ๋ณด ์ ์ฅ ์๋ฃ: {material_data['original_description']}")
-
- except Exception as e:
- print(f"FLANGE ์์ธ์ ๋ณด ์ ์ฅ ์คํจ: {e}")
-
- elif category == 'BOLT' and confidence >= 0.5:
- try:
- bolt_info = classification_result
-
- bolt_insert_query = text("""
- INSERT INTO bolt_details (
- material_id, file_id, bolt_type, thread_type,
- diameter, length, material_standard, material_grade,
- coating_type, includes_nut, includes_washer,
- classification_confidence, additional_info
- )
- VALUES (
- (SELECT id FROM materials WHERE file_id = :file_id AND original_description = :description AND row_number = :row_number),
- :file_id, :bolt_type, :thread_type,
- :diameter, :length, :material_standard, :material_grade,
- :coating_type, :includes_nut, :includes_washer,
- :classification_confidence, :additional_info
- )
- """)
-
-                    # Extract data per the BOLT classifier's result structure
- bolt_details = bolt_info.get('bolt_details', {})
- material_info = bolt_info.get('material', {})
-
- db.execute(bolt_insert_query, {
- "file_id": file_id,
- "description": material_data["original_description"],
- "row_number": material_data["row_number"],
- "bolt_type": bolt_details.get('type', ''),
- "thread_type": bolt_details.get('thread_type', ''),
- "diameter": bolt_details.get('diameter', ''),
- "length": bolt_details.get('length', ''),
- "material_standard": material_info.get('standard', ''),
- "material_grade": material_info.get('grade', ''),
- "coating_type": material_info.get('coating', ''),
- "includes_nut": bolt_details.get('includes_nut', False),
- "includes_washer": bolt_details.get('includes_washer', False),
- "classification_confidence": confidence,
- "additional_info": json.dumps(bolt_info, ensure_ascii=False)
- })
-
- print(f"BOLT ์์ธ์ ๋ณด ์ ์ฅ ์๋ฃ: {material_data['original_description']}")
-
- except Exception as e:
- print(f"BOLT ์์ธ์ ๋ณด ์ ์ฅ ์คํจ: {e}")
-
- elif category == 'GASKET' and confidence >= 0.5:
- try:
- gasket_info = classification_result
-
- gasket_insert_query = text("""
- INSERT INTO gasket_details (
- material_id, file_id, gasket_type, gasket_subtype,
- material_type, size_inches, pressure_rating,
- thickness, temperature_range, fire_safe,
- classification_confidence, additional_info
- )
- VALUES (
- (SELECT id FROM materials WHERE file_id = :file_id AND original_description = :description AND row_number = :row_number),
- :file_id, :gasket_type, :gasket_subtype,
- :material_type, :size_inches, :pressure_rating,
- :thickness, :temperature_range, :fire_safe,
- :classification_confidence, :additional_info
- )
- """)
-
-                    # Extract data per the GASKET classifier's result structure
- gasket_type_info = gasket_info.get('gasket_type', {})
- gasket_material_info = gasket_info.get('gasket_material', {})
- pressure_info = gasket_info.get('pressure_rating', {})
- size_info = gasket_info.get('size_info', {})
- temp_info = gasket_info.get('temperature_info', {})
-
-                    # Extract SWG detail info
- swg_details = gasket_material_info.get('swg_details', {})
- additional_info = {
- "swg_details": swg_details,
- "face_type": swg_details.get('face_type', ''),
- "construction": swg_details.get('detailed_construction', ''),
- "filler": swg_details.get('filler', ''),
- "outer_ring": swg_details.get('outer_ring', ''),
- "inner_ring": swg_details.get('inner_ring', '')
- }
-
- db.execute(gasket_insert_query, {
- "file_id": file_id,
- "description": material_data["original_description"],
- "row_number": material_data["row_number"],
- "gasket_type": gasket_type_info.get('type', ''),
- "gasket_subtype": gasket_type_info.get('subtype', ''),
- "material_type": gasket_material_info.get('material', ''),
- "size_inches": material_data.get('main_nom', '') or material_data.get('size_spec', ''),
- "pressure_rating": pressure_info.get('rating', ''),
- "thickness": swg_details.get('thickness', None),
- "temperature_range": temp_info.get('range', ''),
- "fire_safe": gasket_info.get('fire_safe', False),
- "classification_confidence": confidence,
- "additional_info": json.dumps(additional_info, ensure_ascii=False)
- })
-
- print(f"GASKET ์์ธ์ ๋ณด ์ ์ฅ ์๋ฃ: {material_data['original_description']}")
-
- except Exception as e:
- print(f"GASKET ์์ธ์ ๋ณด ์ ์ฅ ์คํจ: {e}")
-
- elif category == 'INSTRUMENT' and confidence >= 0.5:
- try:
- inst_info = classification_result
-
- inst_insert_query = text("""
- INSERT INTO instrument_details (
- material_id, file_id, instrument_type, instrument_subtype,
- measurement_type, measurement_range, accuracy,
- connection_type, connection_size, body_material,
- classification_confidence, additional_info
- )
- VALUES (
- (SELECT id FROM materials WHERE file_id = :file_id AND original_description = :description AND row_number = :row_number),
- :file_id, :instrument_type, :instrument_subtype,
- :measurement_type, :measurement_range, :accuracy,
- :connection_type, :connection_size, :body_material,
- :classification_confidence, :additional_info
- )
- """)
-
-                    # Extract data per the INSTRUMENT classifier's result structure
- inst_type_info = inst_info.get('instrument_type', {})
- measurement_info = inst_info.get('measurement', {})
- connection_info = inst_info.get('connection', {})
-
- db.execute(inst_insert_query, {
- "file_id": file_id,
- "description": material_data["original_description"],
- "row_number": material_data["row_number"],
- "instrument_type": inst_type_info.get('type', ''),
- "instrument_subtype": inst_type_info.get('subtype', ''),
- "measurement_type": measurement_info.get('type', ''),
- "measurement_range": measurement_info.get('range', ''),
- "accuracy": measurement_info.get('accuracy', ''),
- "connection_type": connection_info.get('type', ''),
- "connection_size": connection_info.get('size', ''),
- "body_material": inst_info.get('material', ''),
- "classification_confidence": confidence,
- "additional_info": json.dumps(inst_info, ensure_ascii=False)
- })
-
- print(f"INSTRUMENT ์์ธ์ ๋ณด ์ ์ฅ ์๋ฃ: {material_data['original_description']}")
-
- except Exception as e:
- print(f"INSTRUMENT ์์ธ์ ๋ณด ์ ์ฅ ์คํจ: {e}")
-
- materials_inserted += 1
-
- db.commit()
-
- return {
- "success": True,
- "message": f"์์ ํ DB ์ ์ฅ ์ฑ๊ณต! {materials_inserted}๊ฐ ์์ฌ ์ ์ฅ๋จ",
- "original_filename": file.filename,
- "file_id": file_id,
- "parsed_materials_count": parsed_count,
- "saved_materials_count": materials_inserted,
- "sample_materials": materials_data[:3] if materials_data else []
- }
-
- except Exception as e:
- db.rollback()
- if os.path.exists(file_path):
- os.remove(file_path)
- raise HTTPException(status_code=500, detail=f"File processing failed: {str(e)}")
-@router.get("/materials")
-async def get_materials(
- project_id: Optional[int] = None,
- file_id: Optional[int] = None,
- job_no: Optional[str] = None,
- filename: Optional[str] = None,
- revision: Optional[str] = None,
- skip: int = 0,
- limit: int = 100,
- search: Optional[str] = None,
- item_type: Optional[str] = None,
- material_grade: Optional[str] = None,
- size_spec: Optional[str] = None,
- file_filter: Optional[str] = None,
- sort_by: Optional[str] = None,
- db: Session = Depends(get_db)
-):
- """
- List stored materials (filterable by job_no, filename, and revision)
- """
- try:
- query = """
- SELECT m.id, m.file_id, m.original_description, m.quantity, m.unit,
- m.size_spec, m.main_nom, m.red_nom, m.material_grade, m.line_number, m.row_number,
- m.classified_category, m.classification_confidence, m.classification_details,
- m.created_at,
- f.original_filename, f.project_id, f.job_no, f.revision,
- p.official_project_code, p.project_name
- FROM materials m
- LEFT JOIN files f ON m.file_id = f.id
- LEFT JOIN projects p ON f.project_id = p.id
- WHERE 1=1
- """
- params = {}
- if project_id:
- query += " AND f.project_id = :project_id"
- params["project_id"] = project_id
- if file_id:
- query += " AND m.file_id = :file_id"
- params["file_id"] = file_id
- if job_no:
- query += " AND f.job_no = :job_no"
- params["job_no"] = job_no
- if filename:
- query += " AND f.original_filename = :filename"
- params["filename"] = filename
- if revision:
- query += " AND f.revision = :revision"
- params["revision"] = revision
- if search:
- query += " AND (m.original_description ILIKE :search OR m.material_grade ILIKE :search)"
- params["search"] = f"%{search}%"
- if item_type:
- query += " AND m.classified_category = :item_type"
- params["item_type"] = item_type
- if material_grade:
- query += " AND m.material_grade ILIKE :material_grade"
- params["material_grade"] = f"%{material_grade}%"
- if size_spec:
- query += " AND m.size_spec ILIKE :size_spec"
- params["size_spec"] = f"%{size_spec}%"
- if file_filter:
- query += " AND f.original_filename ILIKE :file_filter"
- params["file_filter"] = f"%{file_filter}%"
-
- # Sorting
- if sort_by:
- if sort_by == "quantity_desc":
- query += " ORDER BY m.quantity DESC"
- elif sort_by == "quantity_asc":
- query += " ORDER BY m.quantity ASC"
- elif sort_by == "name_asc":
- query += " ORDER BY m.original_description ASC"
- elif sort_by == "name_desc":
- query += " ORDER BY m.original_description DESC"
- elif sort_by == "created_desc":
- query += " ORDER BY m.created_at DESC"
- elif sort_by == "created_asc":
- query += " ORDER BY m.created_at ASC"
- else:
- query += " ORDER BY m.line_number ASC"
- else:
- query += " ORDER BY m.line_number ASC"
-
- query += " LIMIT :limit OFFSET :skip"
- params["limit"] = limit
- params["skip"] = skip
-
- result = db.execute(text(query), params)
- materials = result.fetchall()
-
- # Fetch the total count
- count_query = """
- SELECT COUNT(*) as total
- FROM materials m
- LEFT JOIN files f ON m.file_id = f.id
- WHERE 1=1
- """
- count_params = {}
-
- if project_id:
- count_query += " AND f.project_id = :project_id"
- count_params["project_id"] = project_id
-
- if file_id:
- count_query += " AND m.file_id = :file_id"
- count_params["file_id"] = file_id
-
- if search:
- count_query += " AND (m.original_description ILIKE :search OR m.material_grade ILIKE :search)"
- count_params["search"] = f"%{search}%"
-
- if item_type:
- count_query += " AND m.classified_category = :item_type"
- count_params["item_type"] = item_type
-
- if material_grade:
- count_query += " AND m.material_grade ILIKE :material_grade"
- count_params["material_grade"] = f"%{material_grade}%"
-
- if size_spec:
- count_query += " AND m.size_spec ILIKE :size_spec"
- count_params["size_spec"] = f"%{size_spec}%"
-
- if file_filter:
- count_query += " AND f.original_filename ILIKE :file_filter"
- count_params["file_filter"] = f"%{file_filter}%"
-
- count_result = db.execute(text(count_query), count_params)
- total_count = count_result.fetchone()[0]
-
- return {
- "success": True,
- "total_count": total_count,
- "returned_count": len(materials),
- "skip": skip,
- "limit": limit,
- "materials": [
- {
- "id": m.id,
- "file_id": m.file_id,
- "filename": m.original_filename,
- "project_id": m.project_id,
- "project_code": m.official_project_code,
- "project_name": m.project_name,
- "original_description": m.original_description,
- "quantity": float(m.quantity) if m.quantity else 0,
- "unit": m.unit,
- "size_spec": m.size_spec,
- "material_grade": m.material_grade,
- "line_number": m.line_number,
- "row_number": m.row_number,
- "classified_category": m.classified_category,
- "classification_confidence": float(m.classification_confidence) if m.classification_confidence else 0,
- "classification_details": json.loads(m.classification_details) if m.classification_details else None,
- "created_at": m.created_at
- }
- for m in materials
- ]
- }
-
- except Exception as e:
- raise HTTPException(status_code=500, detail=f"Material query failed: {str(e)}")
-
-@router.get("/materials/summary")
-async def get_materials_summary(
- project_id: Optional[int] = None,
- file_id: Optional[int] = None,
- db: Session = Depends(get_db)
-):
- """Material summary statistics"""
- try:
- query = """
- SELECT
- COUNT(*) as total_items,
- COUNT(DISTINCT m.original_description) as unique_descriptions,
- COUNT(DISTINCT m.size_spec) as unique_sizes,
- COUNT(DISTINCT m.material_grade) as unique_materials,
- SUM(m.quantity) as total_quantity,
- AVG(m.quantity) as avg_quantity,
- MIN(m.created_at) as earliest_upload,
- MAX(m.created_at) as latest_upload
- FROM materials m
- LEFT JOIN files f ON m.file_id = f.id
- WHERE 1=1
- """
-
- params = {}
-
- if project_id:
- query += " AND f.project_id = :project_id"
- params["project_id"] = project_id
-
- if file_id:
- query += " AND m.file_id = :file_id"
- params["file_id"] = file_id
-
- result = db.execute(text(query), params)
- summary = result.fetchone()
-
- return {
- "success": True,
- "summary": {
- "total_items": summary.total_items,
- "unique_descriptions": summary.unique_descriptions,
- "unique_sizes": summary.unique_sizes,
- "unique_materials": summary.unique_materials,
- "total_quantity": float(summary.total_quantity) if summary.total_quantity else 0,
- "avg_quantity": round(float(summary.avg_quantity), 2) if summary.avg_quantity else 0,
- "earliest_upload": summary.earliest_upload,
- "latest_upload": summary.latest_upload
- }
- }
-
- except Exception as e:
- raise HTTPException(status_code=500, detail=f"Summary query failed: {str(e)}")
-
-@router.get("/materials/compare-revisions")
-async def compare_revisions(
- job_no: str,
- filename: str,
- old_revision: str,
- new_revision: str,
- db: Session = Depends(get_db)
-):
- """
- Compare materials between two revisions
- """
- try:
- # Fetch old-revision materials
- old_materials_query = text("""
- SELECT m.original_description, m.quantity, m.unit, m.size_spec,
- m.material_grade, m.classified_category, m.classification_confidence
- FROM materials m
- JOIN files f ON m.file_id = f.id
- WHERE f.job_no = :job_no
- AND f.original_filename = :filename
- AND f.revision = :old_revision
- """)
-
- old_result = db.execute(old_materials_query, {
- "job_no": job_no,
- "filename": filename,
- "old_revision": old_revision
- })
- old_materials = old_result.fetchall()
-
- # Fetch new-revision materials
- new_materials_query = text("""
- SELECT m.original_description, m.quantity, m.unit, m.size_spec,
- m.material_grade, m.classified_category, m.classification_confidence
- FROM materials m
- JOIN files f ON m.file_id = f.id
- WHERE f.job_no = :job_no
- AND f.original_filename = :filename
- AND f.revision = :new_revision
- """)
-
- new_result = db.execute(new_materials_query, {
- "job_no": job_no,
- "filename": filename,
- "new_revision": new_revision
- })
- new_materials = new_result.fetchall()
-
- # Helper that builds a comparison key for a material
- def create_material_key(material):
- return f"{material.original_description}_{material.size_spec}_{material.material_grade}"
-
- # Convert old-revision materials into a dict keyed by material key
- old_materials_dict = {}
- for material in old_materials:
- key = create_material_key(material)
- old_materials_dict[key] = {
- "original_description": material.original_description,
- "quantity": float(material.quantity) if material.quantity else 0,
- "unit": material.unit,
- "size_spec": material.size_spec,
- "material_grade": material.material_grade,
- "classified_category": material.classified_category,
- "classification_confidence": material.classification_confidence
- }
-
- # Convert new-revision materials into a dict keyed by material key
- new_materials_dict = {}
- for material in new_materials:
- key = create_material_key(material)
- new_materials_dict[key] = {
- "original_description": material.original_description,
- "quantity": float(material.quantity) if material.quantity else 0,
- "unit": material.unit,
- "size_spec": material.size_spec,
- "material_grade": material.material_grade,
- "classified_category": material.classified_category,
- "classification_confidence": material.classification_confidence
- }
-
- # Analyze changes
- all_keys = set(old_materials_dict.keys()) | set(new_materials_dict.keys())
-
- added_items = []
- removed_items = []
- changed_items = []
-
- for key in all_keys:
- old_item = old_materials_dict.get(key)
- new_item = new_materials_dict.get(key)
-
- if old_item and not new_item:
- # Removed item
- removed_items.append({
- "key": key,
- "item": old_item,
- "change_type": "removed"
- })
- elif not old_item and new_item:
- # Added item
- added_items.append({
- "key": key,
- "item": new_item,
- "change_type": "added"
- })
- elif old_item and new_item:
- # Check for a quantity change
- if old_item["quantity"] != new_item["quantity"]:
- changed_items.append({
- "key": key,
- "old_item": old_item,
- "new_item": new_item,
- "quantity_change": new_item["quantity"] - old_item["quantity"],
- "change_type": "quantity_changed"
- })
-
- # Per-category statistics
- def calculate_category_stats(items):
- stats = {}
- for item in items:
- category = item.get("item", {}).get("classified_category", "OTHER")
- if category not in stats:
- stats[category] = {"count": 0, "total_quantity": 0}
- stats[category]["count"] += 1
- stats[category]["total_quantity"] += item.get("item", {}).get("quantity", 0)
- return stats
-
- added_stats = calculate_category_stats(added_items)
- removed_stats = calculate_category_stats(removed_items)
- changed_stats = calculate_category_stats(changed_items)
-
- return {
- "success": True,
- "comparison": {
- "old_revision": old_revision,
- "new_revision": new_revision,
- "filename": filename,
- "job_no": job_no,
- "summary": {
- "added_count": len(added_items),
- "removed_count": len(removed_items),
- "changed_count": len(changed_items),
- "total_changes": len(added_items) + len(removed_items) + len(changed_items)
- },
- "changes": {
- "added": added_items,
- "removed": removed_items,
- "changed": changed_items
- },
- "category_stats": {
- "added": added_stats,
- "removed": removed_stats,
- "changed": changed_stats
- }
- }
- }
-
- except Exception as e:
- raise HTTPException(status_code=500, detail=f"Revision comparison failed: {str(e)}")
-
-@router.post("/materials/update-classification-details")
-async def update_classification_details(
- file_id: Optional[int] = None,
- db: Session = Depends(get_db)
-):
- """Update classification_details for existing materials"""
- try:
- # Fetch materials that need updating
- query = """
- SELECT id, original_description, size_spec, classified_category
- FROM materials
- WHERE classification_details IS NULL OR classification_details = '{}'
- """
- params = {}
- if file_id:
- query += " AND file_id = :file_id"
- params["file_id"] = file_id
-
- query += " ORDER BY id"
- result = db.execute(text(query), params)
- materials = result.fetchall()
-
- if not materials:
- return {
- "success": True,
- "message": "No materials to update.",
- "updated_count": 0
- }
-
- updated_count = 0
- for material in materials:
- material_id = material.id
- description = material.original_description
- size_spec = material.size_spec
- category = material.classified_category
-
- print(f"Reclassifying material {material_id}: {description}")
-
- # Call the classifier matching the category
- classification_result = None
-
- if category == 'PIPE':
- classification_result = classify_pipe("", description, size_spec, 0.0)
- elif category == 'FITTING':
- classification_result = classify_fitting("", description, size_spec)
- elif category == 'VALVE':
- classification_result = classify_valve("", description, size_spec)
- elif category == 'FLANGE':
- classification_result = classify_flange("", description, size_spec)
- elif category == 'BOLT':
- classification_result = classify_bolt("", description, size_spec)
- elif category == 'GASKET':
- classification_result = classify_gasket("", description, size_spec)
- elif category == 'INSTRUMENT':
- classification_result = classify_instrument("", description, size_spec)
- else:
- # No category: try each classifier in turn
- classification_result = classify_pipe("", description, size_spec, 0.0)
- if classification_result.get("overall_confidence", 0) < 0.5:
- classification_result = classify_fitting("", description, size_spec)
- if classification_result.get("overall_confidence", 0) < 0.5:
- classification_result = classify_valve("", description, size_spec)
- if classification_result.get("overall_confidence", 0) < 0.5:
- classification_result = classify_flange("", description, size_spec)
- if classification_result.get("overall_confidence", 0) < 0.5:
- classification_result = classify_bolt("", description, size_spec)
- if classification_result.get("overall_confidence", 0) < 0.5:
- classification_result = classify_gasket("", description, size_spec)
- if classification_result.get("overall_confidence", 0) < 0.5:
- classification_result = classify_instrument("", description, size_spec)
-
- if classification_result:
- # Serialize classification_details to JSON
- classification_details = json.dumps(classification_result, ensure_ascii=False)
-
- # Update the DB row
- update_query = text("""
- UPDATE materials
- SET classification_details = :classification_details,
- updated_at = NOW()
- WHERE id = :material_id
- """)
-
- db.execute(update_query, {
- "material_id": material_id,
- "classification_details": classification_details
- })
-
- updated_count += 1
- print(f"Material {material_id} updated")
-
- db.commit()
-
- return {
- "success": True,
- "message": f"Classification details updated for {updated_count} materials.",
- "updated_count": updated_count,
- "total_materials": len(materials)
- }
-
- except Exception as e:
- db.rollback()
- raise HTTPException(status_code=500, detail=f"Failed to update classification details: {str(e)}")
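The removed `compare_revisions` endpoint above keys each material by description, size, and grade, then set-compares the two revisions' key sets. The core bookkeeping can be sketched standalone, with plain dicts standing in for the SQLAlchemy rows (field names copied from the endpoint):

```python
# Minimal sketch of the revision-diff logic used by /materials/compare-revisions.

def material_key(m: dict) -> str:
    # Same composite key as the endpoint: description + size + grade
    return f"{m['original_description']}_{m['size_spec']}_{m['material_grade']}"

def diff_revisions(old: list[dict], new: list[dict]) -> dict:
    old_d = {material_key(m): m for m in old}
    new_d = {material_key(m): m for m in new}
    added = [new_d[k] for k in new_d.keys() - old_d.keys()]
    removed = [old_d[k] for k in old_d.keys() - new_d.keys()]
    # An item present in both revisions counts as changed only if its quantity moved
    changed = [
        {"key": k, "quantity_change": new_d[k]["quantity"] - old_d[k]["quantity"]}
        for k in old_d.keys() & new_d.keys()
        if old_d[k]["quantity"] != new_d[k]["quantity"]
    ]
    return {"added": added, "removed": removed, "changed": changed}
```

Note that because the key omits quantity, a pure quantity edit is reported as a change rather than as a remove-plus-add pair.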
diff --git a/backend/app/auth/__init__.py b/backend/app/auth/__init__.py
index 915a424..c7dbd82 100644
--- a/backend/app/auth/__init__.py
+++ b/backend/app/auth/__init__.py
@@ -61,3 +61,19 @@ __all__ = [
'RolePermission',
'UserRepository'
]
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/backend/app/auth/auth_controller.py b/backend/app/auth/auth_controller.py
index a50c916..d6347e7 100644
--- a/backend/app/auth/auth_controller.py
+++ b/backend/app/auth/auth_controller.py
@@ -391,3 +391,19 @@ async def delete_user(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="An error occurred while deleting the user"
)
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/backend/app/auth/jwt_service.py b/backend/app/auth/jwt_service.py
index a975ead..c2d5165 100644
--- a/backend/app/auth/jwt_service.py
+++ b/backend/app/auth/jwt_service.py
@@ -249,3 +249,19 @@ class JWTService:
# JWT service instance
jwt_service = JWTService()
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/backend/app/auth/middleware.py b/backend/app/auth/middleware.py
index 3389ea7..3d00ad2 100644
--- a/backend/app/auth/middleware.py
+++ b/backend/app/auth/middleware.py
@@ -303,3 +303,19 @@ async def get_current_user_optional(
except Exception as e:
logger.debug(f"Optional auth failed: {str(e)}")
return None
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/backend/app/auth/models.py b/backend/app/auth/models.py
index 5795364..a11b42a 100644
--- a/backend/app/auth/models.py
+++ b/backend/app/auth/models.py
@@ -352,3 +352,19 @@ class UserRepository:
self.db.rollback()
logger.error(f"Failed to deactivate sessions for user_id {user_id}: {str(e)}")
raise
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/backend/app/config.py b/backend/app/config.py
index 6a19fbc..37e1f37 100644
--- a/backend/app/config.py
+++ b/backend/app/config.py
@@ -205,8 +205,10 @@ class Settings(BaseSettings):
"development": [
"http://localhost:3000",
"http://localhost:5173",
+ "http://localhost:13000",
"http://127.0.0.1:3000",
- "http://127.0.0.1:5173"
+ "http://127.0.0.1:5173",
+ "http://127.0.0.1:13000"
],
"production": [
"https://your-domain.com",
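The config.py change above adds the Docker dev port 13000 to the development CORS origins. The environment-keyed lookup this implies can be sketched as follows (the real `Settings` class in config.py may structure this differently):

```python
# Sketch of an environment-keyed CORS origin table, mirroring the config change.

CORS_ORIGINS = {
    "development": [
        "http://localhost:3000",
        "http://localhost:5173",
        "http://localhost:13000",
        "http://127.0.0.1:3000",
        "http://127.0.0.1:5173",
        "http://127.0.0.1:13000",
    ],
    "production": ["https://your-domain.com"],
}

def cors_origins(environment: str) -> list[str]:
    # Assumption: unknown environments fall back to the development list
    return CORS_ORIGINS.get(environment, CORS_ORIGINS["development"])
```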
diff --git a/backend/app/main.py b/backend/app/main.py
index 5cf6d5a..af5cfab 100644
--- a/backend/app/main.py
+++ b/backend/app/main.py
@@ -18,7 +18,7 @@ settings = get_settings()
# Logger setup
logger = get_logger(__name__)
-# Create the FastAPI app
+# Create the FastAPI app (with an increased request size limit)
app = FastAPI(
title=settings.app_name,
description="Material classification and project management system",
@@ -26,6 +26,27 @@ app = FastAPI(
debug=settings.debug
)
+# Request size limit (raised to 100MB)
+from starlette.middleware.base import BaseHTTPMiddleware
+from starlette.requests import Request
+from starlette.responses import Response
+
+class RequestSizeLimitMiddleware(BaseHTTPMiddleware):
+ def __init__(self, app, max_request_size: int = 100 * 1024 * 1024): # 100MB
+ super().__init__(app)
+ self.max_request_size = max_request_size
+
+ async def dispatch(self, request: Request, call_next):
+ if "content-length" in request.headers:
+ try:
+ content_length = int(request.headers["content-length"])
+ except ValueError:
+ return Response("Bad Request", status_code=400)
+ if content_length > self.max_request_size:
+ return Response("Request Entity Too Large", status_code=413)
+ return await call_next(request)
+
+# Register the request-size-limit middleware
+app.add_middleware(RequestSizeLimitMiddleware, max_request_size=100 * 1024 * 1024)
+
# Error handler setup
setup_error_handlers(app)
@@ -38,10 +59,11 @@ app.add_middleware(
logger.info(f"CORS origins configured for {settings.environment}: {settings.security.cors_origins}")
-# Import and register routers
+# Import and register routers - register the files router first
try:
from .routers import files
app.include_router(files.router, prefix="/files", tags=["files"])
+ logger.info("FILES router registered - highest priority")
except ImportError:
logger.warning("files router not found")
@@ -63,19 +85,26 @@ try:
except ImportError:
logger.warning("material_comparison router not found")
+try:
+ from .routers import dashboard
+ app.include_router(dashboard.router, tags=["dashboard"])
+except ImportError:
+ logger.warning("dashboard router not found")
+
try:
from .routers import tubing
app.include_router(tubing.router, prefix="/tubing", tags=["tubing"])
except ImportError:
logger.warning("tubing router not found")
-# Register the file-management API router
-try:
- from .api import file_management
- app.include_router(file_management.router, tags=["file-management"])
- logger.info("File-management API router registered")
-except ImportError as e:
- logger.warning(f"file_management router not found: {e}")
+# File-management API router (disabled to avoid conflicts with the files router)
+# try:
+# from .api import file_management
+# app.include_router(file_management.router, tags=["file-management"])
+# logger.info("File-management API router registered")
+# except ImportError as e:
+# logger.warning(f"file_management router not found: {e}")
+logger.info("File-management API router disabled (using the files router)")
# Register the auth API router
try:
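The `RequestSizeLimitMiddleware` added in main.py above rejects oversized uploads by reading the declared `Content-Length` header. That check can be sketched as a pure function; note that it only inspects the *declared* size, so a chunked request without that header is waved through:

```python
# Pure-function sketch of the check RequestSizeLimitMiddleware performs.

MAX_REQUEST_SIZE = 100 * 1024 * 1024  # 100 MB, matching the middleware default

def allow_request(headers: dict) -> bool:
    length = headers.get("content-length")
    if length is None:
        return True  # no declared size; body-level limits must catch it
    return int(length) <= MAX_REQUEST_SIZE
```

A stricter variant would also cap the number of bytes actually read from the request stream, rather than trusting the header alone.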
diff --git a/backend/app/routers/dashboard.py b/backend/app/routers/dashboard.py
new file mode 100644
index 0000000..4a4abb8
--- /dev/null
+++ b/backend/app/routers/dashboard.py
@@ -0,0 +1,427 @@
+"""
+Dashboard API
+Provides per-user customized dashboard data
+"""
+
+from fastapi import APIRouter, Depends, HTTPException, Query
+from sqlalchemy.orm import Session
+from sqlalchemy import text, func
+from typing import Optional, Dict, Any, List
+from datetime import datetime, timedelta
+
+from ..database import get_db
+from ..auth.middleware import get_current_user
+from ..services.activity_logger import ActivityLogger
+from ..utils.logger import get_logger
+
+logger = get_logger(__name__)
+router = APIRouter(prefix="/dashboard", tags=["dashboard"])
+
+
+@router.get("/stats")
+async def get_dashboard_stats(
+ current_user: dict = Depends(get_current_user),
+ db: Session = Depends(get_db)
+):
+ """
+ Fetch per-user dashboard statistics
+
+ Returns:
+ dict: statistics tailored to the user's role
+ """
+ try:
+ username = current_user.get('username')
+ user_role = current_user.get('role', 'user')
+
+ # Build role-specific statistics
+ if user_role == 'admin':
+ stats = await get_admin_stats(db)
+ elif user_role == 'manager':
+ stats = await get_manager_stats(db, username)
+ elif user_role == 'designer':
+ stats = await get_designer_stats(db, username)
+ elif user_role == 'purchaser':
+ stats = await get_purchaser_stats(db, username)
+ else:
+ stats = await get_user_stats(db, username)
+
+ return {
+ "success": True,
+ "user_role": user_role,
+ "stats": stats
+ }
+
+ except Exception as e:
+ logger.error(f"Dashboard stats error: {str(e)}")
+ raise HTTPException(status_code=500, detail=f"Dashboard stats query failed: {str(e)}")
+
+
+@router.get("/activities")
+async def get_user_activities(
+ current_user: dict = Depends(get_current_user),
+ limit: int = Query(10, ge=1, le=50),
+ db: Session = Depends(get_db)
+):
+ """
+ Fetch the user's activity history
+
+ Args:
+ limit: number of activities to fetch (1-50)
+
+ Returns:
+ dict: the user's activity history
+ """
+ try:
+ username = current_user.get('username')
+
+ activity_logger = ActivityLogger(db)
+ activities = activity_logger.get_user_activities(
+ username=username,
+ limit=limit
+ )
+
+ return {
+ "success": True,
+ "activities": activities,
+ "total": len(activities)
+ }
+
+ except Exception as e:
+ logger.error(f"User activities error: {str(e)}")
+ raise HTTPException(status_code=500, detail=f"Activity history query failed: {str(e)}")
+
+
+@router.get("/recent-activities")
+async def get_recent_activities(
+ current_user: dict = Depends(get_current_user),
+ days: int = Query(7, ge=1, le=30),
+ limit: int = Query(20, ge=1, le=100),
+ db: Session = Depends(get_db)
+):
+ """
+ Fetch recent activity across all users (for admins/managers)
+
+ Args:
+ days: lookback period in days
+ limit: number of activities to fetch
+
+ Returns:
+ dict: recent activity history
+ """
+ try:
+ user_role = current_user.get('role', 'user')
+
+ # Only admins and managers may view activity across all users
+ if user_role not in ['admin', 'manager']:
+ raise HTTPException(status_code=403, detail="Permission denied")
+
+ activity_logger = ActivityLogger(db)
+ activities = activity_logger.get_recent_activities(
+ days=days,
+ limit=limit
+ )
+
+ return {
+ "success": True,
+ "activities": activities,
+ "period_days": days,
+ "total": len(activities)
+ }
+
+ except HTTPException:
+ raise
+ except Exception as e:
+ logger.error(f"Recent activities error: {str(e)}")
+ raise HTTPException(status_code=500, detail=f"Recent activity query failed: {str(e)}")
+
+
+async def get_admin_stats(db: Session) -> Dict[str, Any]:
+ """Admin statistics"""
+ try:
+ # Total project count
+ total_projects_query = text("SELECT COUNT(*) FROM jobs WHERE status != 'deleted'")
+ total_projects = db.execute(total_projects_query).scalar()
+
+ # Active user count (logins in the last 30 days)
+ active_users_query = text("""
+ SELECT COUNT(DISTINCT username)
+ FROM user_activity_logs
+ WHERE created_at >= CURRENT_TIMESTAMP - INTERVAL '30 days'
+ """)
+ active_users = db.execute(active_users_query).scalar() or 0
+
+ # Files uploaded today
+ today_uploads_query = text("""
+ SELECT COUNT(*)
+ FROM files
+ WHERE DATE(upload_date) = CURRENT_DATE
+ """)
+ today_uploads = db.execute(today_uploads_query).scalar() or 0
+
+ # Total material count
+ total_materials_query = text("SELECT COUNT(*) FROM materials")
+ total_materials = db.execute(total_materials_query).scalar() or 0
+
+ return {
+ "title": "System Administrator",
+ "subtitle": "Manage and monitor the entire system",
+ "metrics": [
+ {"label": "Total projects", "value": total_projects, "icon": "๐", "color": "#667eea"},
+ {"label": "Active users", "value": active_users, "icon": "๐ฅ", "color": "#48bb78"},
+ {"label": "System status", "value": "Healthy", "icon": "๐ข", "color": "#38b2ac"},
+ {"label": "Uploads today", "value": today_uploads, "icon": "๐ค", "color": "#ed8936"}
+ ]
+ }
+
+ except Exception as e:
+ logger.error(f"Admin stats error: {str(e)}")
+ raise
+
+
+async def get_manager_stats(db: Session, username: str) -> Dict[str, Any]:
+ """Manager statistics"""
+ try:
+ # Assigned project count (will use the assigned_to field later)
+ assigned_projects_query = text("""
+ SELECT COUNT(*)
+ FROM jobs
+ WHERE (assigned_to = :username OR created_by = :username)
+ AND status != 'deleted'
+ """)
+ assigned_projects = db.execute(assigned_projects_query, {"username": username}).scalar() or 0
+
+ # Tasks completed this week (based on activity logs)
+ week_completed_query = text("""
+ SELECT COUNT(*)
+ FROM user_activity_logs
+ WHERE activity_type IN ('PROJECT_CREATE', 'PURCHASE_CONFIRM')
+ AND created_at >= CURRENT_TIMESTAMP - INTERVAL '7 days'
+ """)
+ week_completed = db.execute(week_completed_query).scalar() or 0
+
+ # Pending approvals (purchase confirmations, etc.)
+ pending_approvals_query = text("""
+ SELECT COUNT(*)
+ FROM material_purchase_tracking
+ WHERE purchase_status = 'PENDING'
+ OR purchase_status = 'REQUESTED'
+ """)
+ pending_approvals = db.execute(pending_approvals_query).scalar() or 0
+
+ return {
+ "title": "Project Manager",
+ "subtitle": "Manage team projects and monitor progress",
+ "metrics": [
+ {"label": "Assigned projects", "value": assigned_projects, "icon": "๐", "color": "#667eea"},
+ {"label": "Team progress", "value": "87%", "icon": "๐", "color": "#48bb78"},
+ {"label": "Pending approvals", "value": pending_approvals, "icon": "โณ", "color": "#ed8936"},
+ {"label": "Completed this week", "value": week_completed, "icon": "โ", "color": "#38b2ac"}
+ ]
+ }
+
+ except Exception as e:
+ logger.error(f"Manager stats error: {str(e)}")
+ raise
+
+
+async def get_designer_stats(db: Session, username: str) -> Dict[str, Any]:
+ """Designer statistics"""
+ try:
+ # BOM files I uploaded
+ my_files_query = text("""
+ SELECT COUNT(*)
+ FROM files
+ WHERE uploaded_by = :username
+ AND is_active = true
+ """)
+ my_files = db.execute(my_files_query, {"username": username}).scalar() or 0
+
+ # Classified material count
+ classified_materials_query = text("""
+ SELECT COUNT(*)
+ FROM materials m
+ JOIN files f ON m.file_id = f.id
+ WHERE f.uploaded_by = :username
+ AND m.classified_category IS NOT NULL
+ """)
+ classified_materials = db.execute(classified_materials_query, {"username": username}).scalar() or 0
+
+ # Materials awaiting verification
+ pending_verification_query = text("""
+ SELECT COUNT(*)
+ FROM materials m
+ JOIN files f ON m.file_id = f.id
+ WHERE f.uploaded_by = :username
+ AND m.is_verified = false
+ """)
+ pending_verification = db.execute(pending_verification_query, {"username": username}).scalar() or 0
+
+ # Uploads this week
+ week_uploads_query = text("""
+ SELECT COUNT(*)
+ FROM files
+ WHERE uploaded_by = :username
+ AND upload_date >= CURRENT_TIMESTAMP - INTERVAL '7 days'
+ """)
+ week_uploads = db.execute(week_uploads_query, {"username": username}).scalar() or 0
+
+ # Compute the classification completion rate
+ total_materials_query = text("""
+ SELECT COUNT(*)
+ FROM materials m
+ JOIN files f ON m.file_id = f.id
+ WHERE f.uploaded_by = :username
+ """)
+ total_materials = db.execute(total_materials_query, {"username": username}).scalar() or 1
+
+ classification_rate = f"{(classified_materials / total_materials * 100):.0f}%" if total_materials > 0 else "0%"
+
+ return {
+ "title": "Design Engineer",
+ "subtitle": "Manage BOM files and classify materials",
+ "metrics": [
+ {"label": "My BOM files", "value": my_files, "icon": "๐", "color": "#667eea"},
+ {"label": "Classification rate", "value": classification_rate, "icon": "๐ฏ", "color": "#48bb78"},
+ {"label": "Awaiting verification", "value": pending_verification, "icon": "โณ", "color": "#ed8936"},
+ {"label": "Uploads this week", "value": week_uploads, "icon": "๐ค", "color": "#9f7aea"}
+ ]
+ }
+
+ except Exception as e:
+ logger.error(f"Designer stats error: {str(e)}")
+ raise
+
+
+async def get_purchaser_stats(db: Session, username: str) -> Dict[str, Any]:
+ """Purchaser statistics"""
+ try:
+ # Purchase request count
+ purchase_requests_query = text("""
+ SELECT COUNT(*)
+ FROM material_purchase_tracking
+ WHERE purchase_status IN ('PENDING', 'REQUESTED')
+ """)
+ purchase_requests = db.execute(purchase_requests_query).scalar() or 0
+
+ # Completed order count
+ orders_completed_query = text("""
+ SELECT COUNT(*)
+ FROM material_purchase_tracking
+ WHERE purchase_status = 'CONFIRMED'
+ AND confirmed_by = :username
+ """)
+ orders_completed = db.execute(orders_completed_query, {"username": username}).scalar() or 0
+
+ # Awaiting-receipt count
+ receiving_pending_query = text("""
+ SELECT COUNT(*)
+ FROM material_purchase_tracking
+ WHERE purchase_status = 'ORDERED'
+ """)
+ receiving_pending = db.execute(receiving_pending_query).scalar() or 0
+
+ # Purchase amount this month (sample data)
+ monthly_amount = "โฉ2.3M" # should be computed from real data
+
+ return {
+ "title": "Purchasing Agent",
+ "subtitle": "Process purchase requests and manage orders",
+ "metrics": [
+ {"label": "Purchase requests", "value": purchase_requests, "icon": "๐", "color": "#667eea"},
+ {"label": "Orders completed", "value": orders_completed, "icon": "โ", "color": "#48bb78"},
+ {"label": "Awaiting receipt", "value": receiving_pending, "icon": "๐ฆ", "color": "#ed8936"},
+ {"label": "Amount this month", "value": monthly_amount, "icon": "๐ฐ", "color": "#9f7aea"}
+ ]
+ }
+
+ except Exception as e:
+ logger.error(f"Purchaser stats error: {str(e)}")
+ raise
+
+
+async def get_user_stats(db: Session, username: str) -> Dict[str, Any]:
+ """Stats for regular users"""
+ try:
+ # My activity count (last 7 days)
+ my_activities_query = text("""
+ SELECT COUNT(*)
+ FROM user_activity_logs
+ WHERE username = :username
+ AND created_at >= CURRENT_TIMESTAMP - INTERVAL '7 days'
+ """)
+ my_activities = db.execute(my_activities_query, {"username": username}).scalar() or 0
+
+ # Accessible project count (placeholder)
+ accessible_projects = 5
+
+ return {
+ "title": "General User",
+ "subtitle": "Carry out assigned tasks and participate in projects",
+ "metrics": [
+ {"label": "My tasks", "value": 6, "icon": "๐", "color": "#667eea"},
+ {"label": "Completion rate", "value": "75%", "icon": "๐", "color": "#48bb78"},
+ {"label": "Waiting", "value": 2, "icon": "โณ", "color": "#ed8936"},
+ {"label": "Activity this week", "value": my_activities, "icon": "๐ฏ", "color": "#9f7aea"}
+ ]
+ }
+
+ except Exception as e:
+ logger.error(f"User stats error: {str(e)}")
+ raise
+
+
+@router.get("/quick-actions")
+async def get_quick_actions(
+ current_user: dict = Depends(get_current_user)
+):
+ """
+ Fetch the quick-action menu for the user's role
+
+ Returns:
+ dict: list of quick actions for the role
+ """
+ try:
+ user_role = current_user.get('role', 'user')
+
+ quick_actions = {
+ "admin": [
+ {"title": "User management", "icon": "๐ค", "path": "/admin/users", "color": "#667eea"},
+ {"title": "System settings", "icon": "โ๏ธ", "path": "/admin/settings", "color": "#48bb78"},
+ {"title": "Backup management", "icon": "๐พ", "path": "/admin/backup", "color": "#ed8936"},
+ {"title": "Activity logs", "icon": "๐", "path": "/admin/logs", "color": "#9f7aea"}
+ ],
+ "manager": [
+ {"title": "Create project", "icon": "โ", "path": "/projects/new", "color": "#667eea"},
+ {"title": "Team management", "icon": "๐ฅ", "path": "/team", "color": "#48bb78"},
+ {"title": "Progress", "icon": "๐", "path": "/progress", "color": "#38b2ac"},
+ {"title": "Approvals", "icon": "โ", "path": "/approvals", "color": "#ed8936"}
+ ],
+ "designer": [
+ {"title": "BOM upload", "icon": "๐ค", "path": "/upload", "color": "#667eea"},
+ {"title": "Material classification", "icon": "๐ง", "path": "/materials", "color": "#48bb78"},
+ {"title": "Revision management", "icon": "๐", "path": "/revisions", "color": "#38b2ac"},
+ {"title": "Classification verification", "icon": "โ", "path": "/verify", "color": "#ed8936"}
+ ],
+ "purchaser": [
+ {"title": "Purchase confirmation", "icon": "๐", "path": "/purchase", "color": "#667eea"},
+ {"title": "Order management", "icon": "๐", "path": "/orders", "color": "#48bb78"},
+ {"title": "Suppliers", "icon": "๐ข", "path": "/suppliers", "color": "#38b2ac"},
+ {"title": "Receiving", "icon": "๐ฆ", "path": "/receiving", "color": "#ed8936"}
+ ],
+ "user": [
+ {"title": "My tasks", "icon": "๐", "path": "/my-tasks", "color": "#667eea"},
+ {"title": "View projects", "icon": "๐๏ธ", "path": "/projects", "color": "#48bb78"},
+ {"title": "Download reports", "icon": "๐", "path": "/reports", "color": "#38b2ac"},
+ {"title": "Help", "icon": "โ", "path": "/help", "color": "#9f7aea"}
+ ]
+ }
+
+ return {
+ "success": True,
+ "user_role": user_role,
+ "quick_actions": quick_actions.get(user_role, quick_actions["user"])
+ }
+
+ except Exception as e:
+ logger.error(f"Quick actions error: {str(e)}")
+        raise HTTPException(status_code=500, detail=f"Failed to fetch quick actions: {str(e)}")
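The endpoint above resolves the menu with `quick_actions.get(user_role, quick_actions["user"])`, so any unrecognized role silently falls back to the regular-user set. A minimal sketch of that lookup pattern (the titles and paths below are abbreviated from the diff, not the full menus):

```python
# Abbreviated quick-action table; unknown roles fall back to "user".
QUICK_ACTIONS = {
    "admin": [{"title": "User Management", "path": "/admin/users"}],
    "user": [{"title": "My Tasks", "path": "/my-tasks"}],
}

def actions_for(role: str) -> list:
    # dict.get() with a default returns the regular-user menu for any
    # role not present in the table, so the endpoint never raises KeyError.
    return QUICK_ACTIONS.get(role, QUICK_ACTIONS["user"])
```

This is why the handler does not validate `current_user.get('role')` first: a missing or unexpected role degrades gracefully instead of failing.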
diff --git a/backend/app/routers/files.py b/backend/app/routers/files.py
index a416ba4..e91ed98 100644
--- a/backend/app/routers/files.py
+++ b/backend/app/routers/files.py
@@ -1,7 +1,7 @@
-from fastapi import APIRouter, Depends, HTTPException, UploadFile, File, Form
+from fastapi import APIRouter, Depends, HTTPException, UploadFile, File, Form, Request, Query, Body
from sqlalchemy.orm import Session
from sqlalchemy import text
-from typing import List, Optional
+from typing import List, Optional, Dict
import os
import shutil
from datetime import datetime
@@ -12,6 +12,8 @@ from pathlib import Path
import json
from ..database import get_db
+from ..auth.middleware import get_current_user
+from ..services.activity_logger import ActivityLogger, log_activity_from_request
from ..utils.logger import get_logger
from app.services.material_classifier import classify_material
@@ -25,6 +27,7 @@ from app.services.gasket_classifier import classify_gasket
from app.services.instrument_classifier import classify_instrument
from app.services.pipe_classifier import classify_pipe
from app.services.valve_classifier import classify_valve
+from app.services.revision_comparator import get_revision_comparison
router = APIRouter()
@@ -32,13 +35,7 @@ UPLOAD_DIR = Path("uploads")
UPLOAD_DIR.mkdir(exist_ok=True)
ALLOWED_EXTENSIONS = {".xlsx", ".xls", ".csv"}
-@router.get("/")
-async def get_files_info():
- return {
-        "message": "File management API",
- "allowed_extensions": list(ALLOWED_EXTENSIONS),
- "upload_directory": str(UPLOAD_DIR)
- }
+# The API info endpoint has moved to /info
@router.get("/test")
async def test_endpoint():
@@ -74,9 +71,9 @@ def generate_unique_filename(original_filename: str) -> str:
def parse_dataframe(df):
df = df.dropna(how='all')
     # original column names
-    print(f"original columns: {list(df.columns)}")
+    # log removed
     df.columns = df.columns.str.strip().str.lower()
-    print(f"after lowercase conversion: {list(df.columns)}")
+    # log removed
     column_mapping = {
         'description': ['description', 'item', 'material', 'ํ๋ช', '์์ฌ๋ช'],
@@ -96,7 +93,7 @@ def parse_dataframe(df):
mapped_columns[standard_col] = possible_name
break
-    print(f"matched column mapping: {mapped_columns}")
+    # log removed
materials = []
for index, row in df.iterrows():
@@ -172,20 +169,16 @@ def parse_file_data(file_path):
@router.post("/upload")
async def upload_file(
+ request: Request,
file: UploadFile = File(...),
job_no: str = Form(...),
     revision: str = Form("Rev.0"),  # defaults to Rev.0 (new BOM)
     parent_file_id: Optional[int] = Form(None),  # parent file ID for revision uploads
     bom_name: Optional[str] = Form(None),  # BOM name (user input)
-    db: Session = Depends(get_db)
+    db: Session = Depends(get_db),
+    current_user: dict = Depends(get_current_user)
 ):
-    print(f"upload request received:")
-    print(f"  - filename: {file.filename}")
-    print(f"  - job_no: {job_no}")
-    print(f"  - revision: {revision}")
-    print(f"  - parent_file_id: {parent_file_id}")
-    print(f"  - bom_name: {bom_name}")
-    print(f"  - parent_file_id type: {type(parent_file_id)}")
+    # log removed
if not validate_file_extension(file.filename):
raise HTTPException(
status_code=400,
@@ -199,22 +192,22 @@ async def upload_file(
file_path = UPLOAD_DIR / unique_filename
try:
-        print("saving file")
+        # log removed
         with open(file_path, "wb") as buffer:
             shutil.copyfileobj(file.file, buffer)
-        print(f"file saved: {file_path}")
+        # log removed
     except Exception as e:
         raise HTTPException(status_code=500, detail=f"File save failed: {str(e)}")
     try:
-        print("parsing file")
+        # log removed
         materials_data = parse_file_data(str(file_path))
         parsed_count = len(materials_data)
-        print(f"parsing complete: {parsed_count} materials")
+        # log removed
         # create a new revision only for revision uploads
         if parent_file_id is not None:
-            print(f"revision upload mode: parent_file_id = {parent_file_id}")
+            # log removed
             # look up the parent file's info
parent_query = text("""
SELECT original_filename, revision, bom_name FROM files
@@ -268,11 +261,14 @@ async def upload_file(
         # normal upload (new BOM)
         print(f"normal upload mode: new BOM file (Rev.0)")
-        # save file info
+        # save file info (including user info)
         print("starting DB insert")
+ username = current_user.get('username', 'unknown')
+ user_id = current_user.get('user_id')
+
file_insert_query = text("""
- INSERT INTO files (filename, original_filename, file_path, job_no, revision, bom_name, description, file_size, parsed_count, is_active)
- VALUES (:filename, :original_filename, :file_path, :job_no, :revision, :bom_name, :description, :file_size, :parsed_count, :is_active)
+ INSERT INTO files (filename, original_filename, file_path, job_no, revision, bom_name, description, file_size, parsed_count, is_active, uploaded_by)
+ VALUES (:filename, :original_filename, :file_path, :job_no, :revision, :bom_name, :description, :file_size, :parsed_count, :is_active, :uploaded_by)
RETURNING id
""")
@@ -286,11 +282,43 @@ async def upload_file(
             "description": f"BOM file - {parsed_count} materials",
"file_size": file.size,
"parsed_count": parsed_count,
- "is_active": True
+ "is_active": True,
+ "uploaded_by": username
})
file_id = file_result.fetchone()[0]
-        print(f"file saved: file_id = {file_id}")
+        print(f"file saved: file_id = {file_id}, uploaded_by = {username}")
+
+        # perform revision comparison (per the RULES.md coding conventions)
+        revision_comparison = None
+        materials_to_classify = materials_data
+
+        if revision != "Rev.0":  # compare only for revision uploads
+            # log removed
+            try:
+                revision_comparison = get_revision_comparison(db, job_no, revision, materials_data)
+
+                if revision_comparison.get("has_previous_confirmation", False):
+                    print(f"revision comparison result:")
+                    print(f"  - unchanged: {revision_comparison.get('unchanged_count', 0)}")
+                    print(f"  - changed: {revision_comparison.get('changed_count', 0)}")
+                    print(f"  - new: {revision_comparison.get('new_count', 0)}")
+                    print(f"  - removed: {revision_comparison.get('removed_count', 0)}")
+                    print(f"  - needs classification: {revision_comparison.get('classification_needed', 0)}")
+
+                    # classify only the materials that need it (changed + new)
+                    materials_to_classify = (
+                        revision_comparison.get("changed_materials", []) +
+                        revision_comparison.get("new_materials", [])
+                    )
+                else:
+                    print("no previous confirmation data - classifying all materials")
+
+            except Exception as e:
+                logger.error(f"Revision comparison failed: {str(e)}")
+                print(f"revision comparison failed, classifying all materials: {str(e)}")
+
+        print(f"starting material classification: {len(materials_to_classify)} materials")
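For context, the comparison step above lets the upload re-classify only changed and new materials while unchanged ones reuse their confirmed classification. A simplified sketch of that partitioning idea; `partition_materials` and its description-keyed lookup are illustrative, not the actual `get_revision_comparison` implementation:

```python
# Hypothetical sketch: split an uploaded BOM against previously confirmed
# items keyed by description. Matches keep the old classification (attached
# as "previous_item", as the diff's reuse loop expects); the rest must be
# re-classified.
def partition_materials(uploaded, confirmed_by_description):
    unchanged, to_classify = [], []
    for item in uploaded:
        prev = confirmed_by_description.get(item["original_description"])
        if prev is not None:
            unchanged.append({**item, "previous_item": prev})
        else:
            to_classify.append(item)
    return unchanged, to_classify
```

On a large BOM this is the main cost saving: the classifiers only run over the `to_classify` subset.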
     # save material data (with classification) - batch processing for performance
materials_to_insert = []
@@ -301,7 +329,30 @@ async def upload_file(
flange_details_to_insert = []
materials_inserted = 0
- for material_data in materials_data:
+
+    # process unchanged materials first (reuse existing classification results)
+ if revision_comparison and revision_comparison.get("has_previous_confirmation", False):
+ unchanged_materials = revision_comparison.get("unchanged_materials", [])
+ for material_data in unchanged_materials:
+ previous_item = material_data.get("previous_item", {})
+
+            # reuse the previous classification result
+ materials_to_insert.append({
+ "file_id": file_id,
+ "original_description": material_data["original_description"],
+ "classified_category": previous_item.get("category", "UNKNOWN"),
+                "confidence": 1.0,  # confirmed data, so confidence is 100%
+ "quantity": material_data["quantity"],
+ "unit": material_data.get("unit", "EA"),
+ "size_spec": material_data.get("size_spec", ""),
+ "material_grade": previous_item.get("material", ""),
+ "specification": previous_item.get("specification", ""),
+ "reused_from_confirmation": True
+ })
+ materials_inserted += 1
+
+    # process the materials that still need classification
+ for material_data in materials_to_classify:
         # apply the material-type classifiers (PIPE, FITTING, VALVE, etc.)
description = material_data["original_description"]
size_spec = material_data["size_spec"]
@@ -338,7 +389,8 @@ async def upload_file(
material_type = integrated_result.get('category', 'UNKNOWN')
if material_type == "PIPE":
- classification_result = classify_pipe("", description, main_nom or "", length_value)
+ from ..services.pipe_classifier import classify_pipe_for_purchase
+ classification_result = classify_pipe_for_purchase("", description, main_nom or "", length_value)
elif material_type == "FITTING":
classification_result = classify_fitting("", description, main_nom or "", red_nom)
elif material_type == "FLANGE":
@@ -414,8 +466,39 @@ async def upload_file(
if classification_result.get("category") == "PIPE":
                             print("saving PIPE details")
-                            # extract length info - taken directly from material_data
-                            length_mm = material_data.get("length", 0.0) if material_data.get("length") else None
+                            # extract and store end-preparation info
+                            from ..services.pipe_classifier import extract_end_preparation_info
+                            end_prep_info = extract_end_preparation_info(description)
+
+                            # store the end-preparation info in its own table
+ end_prep_insert_query = text("""
+ INSERT INTO pipe_end_preparations (
+ material_id, file_id, end_preparation_type, end_preparation_code,
+ machining_required, cutting_note, original_description, clean_description,
+ confidence, matched_pattern
+ ) VALUES (
+ :material_id, :file_id, :end_preparation_type, :end_preparation_code,
+ :machining_required, :cutting_note, :original_description, :clean_description,
+ :confidence, :matched_pattern
+ )
+ """)
+
+ db.execute(end_prep_insert_query, {
+ "material_id": material_id,
+ "file_id": file_id,
+ "end_preparation_type": end_prep_info["end_preparation_type"],
+ "end_preparation_code": end_prep_info["end_preparation_code"],
+ "machining_required": end_prep_info["machining_required"],
+ "cutting_note": end_prep_info["cutting_note"],
+ "original_description": end_prep_info["original_description"],
+ "clean_description": end_prep_info["clean_description"],
+ "confidence": end_prep_info["confidence"],
+ "matched_pattern": end_prep_info["matched_pattern"]
+ })
+
+                            # extract length - prefer length_info from the classification result,
+                            # then fall back to material_data (plain `or` chain avoids the
+                            # conditional-expression precedence trap)
+                            length_info = classification_result.get("length_info", {})
+                            length_mm = length_info.get("length_mm") or material_data.get("length") or None
                             # also save material_id along with it
pipe_detail_insert_query = text("""
@@ -961,6 +1044,25 @@ async def upload_file(
db.commit()
print(f"์์ฌ ์ ์ฅ ์๋ฃ: {materials_inserted}๊ฐ")
+        # record an activity log entry
+ try:
+ activity_logger = ActivityLogger(db)
+ activity_logger.log_file_upload(
+ username=username,
+ file_id=file_id,
+ filename=file.filename,
+ file_size=file.size or 0,
+ job_no=job_no,
+ revision=revision,
+ user_id=user_id,
+ ip_address=request.client.host if request.client else None,
+ user_agent=request.headers.get('user-agent')
+ )
+            print(f"activity log recorded: {username} - file upload")
+        except Exception as e:
+            print(f"failed to record activity log: {str(e)}")
+            # a logging failure must not affect upload success
+
return {
"success": True,
         "message": f"Upload successful! {materials_inserted} materials were classified.",
@@ -969,6 +1071,7 @@ async def upload_file(
"materials_count": materials_inserted,
"saved_materials_count": materials_inserted,
         "revision": revision,  # include the created revision
+        "uploaded_by": username,  # include the uploading user
"parsed_count": parsed_count
}
@@ -978,7 +1081,7 @@ async def upload_file(
os.remove(file_path)
     raise HTTPException(status_code=500, detail=f"File processing failed: {str(e)}")
-@router.get("/files")
+@router.get("/")
async def get_files(
job_no: Optional[str] = None,
db: Session = Depends(get_db)
@@ -986,7 +1089,7 @@ async def get_files(
     """List files."""
try:
query = """
- SELECT id, filename, original_filename, job_no, revision,
+ SELECT id, filename, original_filename, bom_name, job_no, revision,
description, file_size, parsed_count, upload_date, is_active
FROM files
WHERE is_active = TRUE
@@ -1007,6 +1110,7 @@ async def get_files(
"id": file.id,
"filename": file.filename,
"original_filename": file.original_filename,
+ "bom_name": file.bom_name,
"job_no": file.job_no,
"revision": file.revision,
"description": file.description,
@@ -1062,7 +1166,7 @@ async def get_files_stats(db: Session = Depends(get_db)):
except Exception as e:
         raise HTTPException(status_code=500, detail=f"Failed to fetch statistics: {str(e)}")
-@router.delete("/files/{file_id}")
+@router.delete("/delete/{file_id}")
async def delete_file(file_id: int, db: Session = Depends(get_db)):
     """Delete a file."""
try:
@@ -1086,7 +1190,7 @@ async def delete_file(file_id: int, db: Session = Depends(get_db)):
db.rollback()
         raise HTTPException(status_code=500, detail=f"File deletion failed: {str(e)}")
-@router.get("/materials")
+@router.get("/materials-v2")  # a completely new endpoint
async def get_materials(
project_id: Optional[int] = None,
file_id: Optional[int] = None,
@@ -1104,22 +1208,56 @@ async def get_materials(
db: Session = Depends(get_db)
):
"""
-    List saved materials (filterable by job_no, filename, and revision)
+    List saved materials (filterable by job_no, filename, and revision) - new version
     """
     try:
+        # log removed - avoid excessive output
query = """
SELECT m.id, m.file_id, m.original_description, m.quantity, m.unit,
m.size_spec, m.main_nom, m.red_nom, m.material_grade, m.line_number, m.row_number,
m.created_at, m.classified_category, m.classification_confidence,
m.classification_details,
+ m.is_verified, m.verified_by, m.verified_at,
f.original_filename, f.project_id, f.job_no, f.revision,
p.official_project_code, p.project_name,
pd.outer_diameter, pd.schedule, pd.material_spec, pd.manufacturing_method,
- pd.end_preparation, pd.length_mm
+ pd.end_preparation, pd.length_mm,
+ pep.end_preparation_type, pep.end_preparation_code, pep.machining_required,
+ pep.cutting_note, pep.clean_description as pipe_clean_description,
+ fd.fitting_type, fd.fitting_subtype, fd.connection_method, fd.pressure_rating,
+ fd.material_standard, fd.material_grade as fitting_material_grade, fd.main_size,
+ fd.reduced_size, fd.length_mm as fitting_length_mm, fd.schedule as fitting_schedule,
+ mpt.confirmed_quantity, mpt.purchase_status, mpt.confirmed_by, mpt.confirmed_at,
+            -- prefer the category derived during purchase calculation
+ CASE
+ WHEN mpt.id IS NOT NULL THEN
+ CASE
+ WHEN mpt.description LIKE '%PIPE%' OR mpt.description LIKE '%ํ์ดํ%' THEN 'PIPE'
+                        WHEN mpt.description LIKE '%FITTING%' OR mpt.description LIKE '%ํผํ%' OR mpt.description LIKE '%NIPPLE%' OR mpt.description LIKE '%ELBOW%' OR mpt.description LIKE '%TEE%' OR mpt.description LIKE '%REDUCER%' THEN 'FITTING'
+ WHEN mpt.description LIKE '%VALVE%' OR mpt.description LIKE '%๋ฐธ๋ธ%' THEN 'VALVE'
+ WHEN mpt.description LIKE '%FLANGE%' OR mpt.description LIKE '%ํ๋์ง%' THEN 'FLANGE'
+ WHEN mpt.description LIKE '%BOLT%' OR mpt.description LIKE '%๋ณผํธ%' OR mpt.description LIKE '%STUD%' THEN 'BOLT'
+ WHEN mpt.description LIKE '%GASKET%' OR mpt.description LIKE '%๊ฐ์ค์ผ%' THEN 'GASKET'
+ WHEN mpt.description LIKE '%INSTRUMENT%' OR mpt.description LIKE '%๊ณ๊ธฐ%' THEN 'INSTRUMENT'
+ ELSE m.classified_category
+ END
+ ELSE m.classified_category
+ END as final_classified_category,
+            -- whether purchase calculation has completed
+ CASE WHEN mpt.id IS NOT NULL THEN true ELSE m.is_verified END as final_is_verified,
+ CASE WHEN mpt.id IS NOT NULL THEN 'purchase_calculation' ELSE m.verified_by END as final_verified_by
FROM materials m
LEFT JOIN files f ON m.file_id = f.id
LEFT JOIN projects p ON f.project_id = p.id
LEFT JOIN pipe_details pd ON m.id = pd.material_id
+ LEFT JOIN pipe_end_preparations pep ON m.id = pep.material_id
+ LEFT JOIN fitting_details fd ON m.id = fd.material_id
+ LEFT JOIN valve_details vd ON m.id = vd.material_id
+ LEFT JOIN material_purchase_tracking mpt ON (
+ m.material_hash = mpt.material_hash
+ AND f.job_no = mpt.job_no
+ AND f.revision = mpt.revision
+ )
WHERE 1=1
"""
params = {}
@@ -1220,9 +1358,34 @@ async def get_materials(
count_result = db.execute(text(count_query), count_params)
total_count = count_result.fetchone()[0]
+            # grouping dictionaries per category
+            pipe_groups = {}
+            nipple_groups = {}    # nipples, grouped by length
+            fitting_groups = {}   # general fittings, grouped by quantity
+            flange_groups = {}
+            valve_groups = {}
+            bolt_groups = {}
+            gasket_groups = {}
+            unknown_groups = {}   # UNKNOWN items
+
             # also fetch detail info for each material
material_list = []
+ valve_count = 0
for m in materials:
+                if m.classified_category == 'VALVE':
+                    valve_count += 1
+                # debugging: dump every attribute of the first material
+                if len(material_list) == 0:
+                    # log removed
+                    pass
+
material_dict = {
"id": m.id,
"file_id": m.file_id,
@@ -1239,9 +1402,18 @@ async def get_materials(
"material_grade": m.material_grade,
"line_number": m.line_number,
"row_number": m.row_number,
- "classified_category": m.classified_category,
+                    # prefer the category derived during purchase calculation
+ "classified_category": m.final_classified_category or m.classified_category,
"classification_confidence": float(m.classification_confidence) if m.classification_confidence else 0.0,
"classification_details": m.classification_details,
+ "is_verified": m.final_is_verified if m.final_is_verified is not None else m.is_verified,
+ "verified_by": m.final_verified_by or m.verified_by,
+ "verified_at": m.verified_at,
+ "purchase_confirmed": bool(m.confirmed_quantity),
+ "confirmed_quantity": float(m.confirmed_quantity) if m.confirmed_quantity else None,
+ "purchase_status": m.purchase_status,
+ "purchase_confirmed_by": m.confirmed_by,
+ "purchase_confirmed_at": m.confirmed_at,
"created_at": m.created_at
}
@@ -1249,7 +1421,7 @@ async def get_materials(
if m.classified_category == 'PIPE':
                     # take pipe_details from the JOINed result
if hasattr(m, 'outer_diameter') and m.outer_diameter is not None:
- material_dict['pipe_details'] = {
+ pipe_details = {
"outer_diameter": m.outer_diameter,
"schedule": m.schedule,
"material_spec": m.material_spec,
@@ -1257,23 +1429,119 @@ async def get_materials(
"end_preparation": m.end_preparation,
"length_mm": float(m.length_mm) if m.length_mm else None
}
+
+                        # build the pipe grouping key (end-preparation info is excluded from grouping)
+                        # take clean_description from the pep table, or compute it directly when absent
+ if hasattr(m, 'pipe_clean_description') and m.pipe_clean_description:
+ clean_description = m.pipe_clean_description
+ else:
+ from ..services.pipe_classifier import get_purchase_pipe_description
+ clean_description = get_purchase_pipe_description(m.original_description)
+ pipe_key = f"{clean_description}|{m.size_spec}|{m.material_grade}"
+
+                        # log removed - avoid excessive output
+
+ if pipe_key not in pipe_groups:
+ pipe_groups[pipe_key] = {
+ "total_length_mm": 0,
+ "total_quantity": 0,
+ "materials": []
+ }
+
+                        # sum individual pipe lengths (use the actual lengths stored in the DB)
+                        individual_length = 0.0
+                        if pipe_details["length_mm"]:
+                            # length_mm from the DB is already the per-pipe actual length, so do not multiply by quantity
+                            individual_length = float(pipe_details["length_mm"])
+                            pipe_groups[pipe_key]["total_length_mm"] += individual_length
+                        pipe_groups[pipe_key]["total_quantity"] += 1  # pipe count increments by one
+                        pipe_groups[pipe_key]["materials"].append(material_dict)
+
+                        # attach the individual length to pipe_details
+                        pipe_details["individual_total_length"] = individual_length
+
+                        # also attach the clean purchase description
+ material_dict['clean_description'] = clean_description
+ material_dict['pipe_details'] = pipe_details
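The grouping above merges pipes that share the same `clean_description|size_spec|material_grade` key, summing per-pipe lengths and counting pieces. A self-contained sketch of that accumulation (field names mirror the diff, but `group_pipes` itself is illustrative, not code from the PR):

```python
# Merge pipes by a composite key, accumulating total length and piece count.
def group_pipes(pipes):
    groups = {}
    for p in pipes:
        key = f"{p['clean_description']}|{p['size_spec']}|{p['material_grade']}"
        g = groups.setdefault(key, {"total_length_mm": 0.0, "total_quantity": 0, "materials": []})
        # each record's length_mm is already the per-pipe length (no quantity multiply)
        g["total_length_mm"] += float(p.get("length_mm") or 0.0)
        g["total_quantity"] += 1
        g["materials"].append(p)
    return groups
```

Keeping end-preparation details out of the key is the design point: two pipes that differ only in cut/bevel finishing still collapse into one purchase line.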
elif m.classified_category == 'FITTING':
- fitting_query = text("SELECT * FROM fitting_details WHERE material_id = :material_id")
- fitting_result = db.execute(fitting_query, {"material_id": m.id})
- fitting_detail = fitting_result.fetchone()
- if fitting_detail:
- material_dict['fitting_details'] = {
- "fitting_type": fitting_detail.fitting_type,
- "fitting_subtype": fitting_detail.fitting_subtype,
- "connection_method": fitting_detail.connection_method,
- "pressure_rating": fitting_detail.pressure_rating,
- "material_standard": fitting_detail.material_standard,
- "material_grade": fitting_detail.material_grade,
- "main_size": fitting_detail.main_size,
- "reduced_size": fitting_detail.reduced_size,
- "length_mm": float(fitting_detail.length_mm) if fitting_detail.length_mm else None,
- "schedule": fitting_detail.schedule
+                    # handle CAP and PLUG first (fitting_type may be missing)
+ if 'CAP' in m.original_description.upper() or 'PLUG' in m.original_description.upper():
+                        # group CAPs and PLUGs
+ from ..services.pipe_classifier import get_purchase_pipe_description
+ clean_description = get_purchase_pipe_description(m.original_description)
+ fitting_key = f"{clean_description}|{m.size_spec}|{m.material_grade}"
+
+ if fitting_key not in fitting_groups:
+ fitting_groups[fitting_key] = {
+ "total_quantity": 0,
+ "materials": []
+ }
+
+ fitting_groups[fitting_key]["total_quantity"] += material_dict["quantity"]
+ fitting_groups[fitting_key]["materials"].append(material_dict)
+ material_dict['clean_description'] = clean_description
+                    # use the JOINed fitting_details data directly
+                    elif hasattr(m, 'fitting_type') and m.fitting_type is not None:
+                        # log removed - avoid excessive output
+
+ fitting_details = {
+ "fitting_type": m.fitting_type,
+ "fitting_subtype": m.fitting_subtype,
+ "connection_method": m.connection_method,
+ "pressure_rating": m.pressure_rating,
+ "material_standard": m.material_standard,
+ "material_grade": m.fitting_material_grade,
+ "main_size": m.main_size,
+ "reduced_size": m.reduced_size,
+ "length_mm": float(m.fitting_length_mm) if m.fitting_length_mm else None,
+ "schedule": m.fitting_schedule
}
+ material_dict['fitting_details'] = fitting_details
+
+                        # nipples: group by length
+ if 'NIPPLE' in m.original_description.upper() and m.fitting_length_mm:
+                            # strip end-preparation info
+ from ..services.pipe_classifier import get_purchase_pipe_description
+ clean_description = get_purchase_pipe_description(m.original_description)
+ nipple_key = f"{clean_description}|{m.size_spec}|{m.material_grade}|{m.fitting_length_mm}mm"
+
+                            # log removed - avoid excessive output
+
+ if nipple_key not in nipple_groups:
+ nipple_groups[nipple_key] = {
+ "total_length_mm": 0,
+ "total_quantity": 0,
+ "materials": []
+ }
+
+                            # total nipple length (quantity x unit length), with explicit type conversion
+ individual_total_length = float(material_dict["quantity"]) * float(m.fitting_length_mm)
+ nipple_groups[nipple_key]["total_length_mm"] += individual_total_length
+ nipple_groups[nipple_key]["total_quantity"] += material_dict["quantity"]
+ nipple_groups[nipple_key]["materials"].append(material_dict)
+
+                            # attach the total length to fitting_details
+ fitting_details["individual_total_length"] = individual_total_length
+ fitting_details["is_nipple"] = True
+
+                            # also attach the clean purchase description
+ material_dict['clean_description'] = clean_description
+ else:
+                        # general fitting (not a nipple) - group by quantity
+ from ..services.pipe_classifier import get_purchase_pipe_description
+ clean_description = get_purchase_pipe_description(m.original_description)
+ fitting_key = f"{clean_description}|{m.size_spec}|{m.material_grade}"
+
+ if fitting_key not in fitting_groups:
+ fitting_groups[fitting_key] = {
+ "total_quantity": 0,
+ "materials": []
+ }
+
+ fitting_groups[fitting_key]["total_quantity"] += material_dict["quantity"]
+ fitting_groups[fitting_key]["materials"].append(material_dict)
+
+                        # also attach the clean purchase description
+ material_dict['clean_description'] = clean_description
elif m.classified_category == 'FLANGE':
flange_query = text("SELECT * FROM flange_details WHERE material_id = :material_id")
flange_result = db.execute(flange_query, {"material_id": m.id})
@@ -1287,6 +1555,21 @@ async def get_materials(
"material_grade": flange_detail.material_grade,
"size_inches": flange_detail.size_inches
}
+
+                        # flange grouping
+ from ..services.pipe_classifier import get_purchase_pipe_description
+ clean_description = get_purchase_pipe_description(m.original_description)
+ flange_key = f"{m.size_spec}|{m.material_grade}|{flange_detail.pressure_rating if flange_detail else ''}|{flange_detail.flange_type if flange_detail else ''}"
+
+ if flange_key not in flange_groups:
+ flange_groups[flange_key] = {
+ "total_quantity": 0,
+ "materials": []
+ }
+
+ flange_groups[flange_key]["total_quantity"] += material_dict["quantity"]
+ flange_groups[flange_key]["materials"].append(material_dict)
+ material_dict['clean_description'] = clean_description
elif m.classified_category == 'GASKET':
gasket_query = text("SELECT * FROM gasket_details WHERE material_id = :material_id")
gasket_result = db.execute(gasket_query, {"material_id": m.id})
@@ -1300,6 +1583,20 @@ async def get_materials(
"thickness": gasket_detail.thickness,
"temperature_range": gasket_detail.temperature_range
}
+
+                        # gasket grouping - by size, pressure rating, and material
+                        # extract the key info from original_description
+ description = m.original_description or ''
+ gasket_key = f"{m.size_spec}|{description}"
+
+ if gasket_key not in gasket_groups:
+ gasket_groups[gasket_key] = {
+ "total_quantity": 0,
+ "materials": []
+ }
+
+ gasket_groups[gasket_key]["total_quantity"] += material_dict["quantity"]
+ gasket_groups[gasket_key]["materials"].append(material_dict)
elif m.classified_category == 'VALVE':
valve_query = text("SELECT * FROM valve_details WHERE material_id = :material_id")
valve_result = db.execute(valve_query, {"material_id": m.id})
@@ -1314,6 +1611,21 @@ async def get_materials(
"body_material": valve_detail.body_material,
"size_inches": valve_detail.size_inches
}
+
+                        # valve grouping
+ from ..services.pipe_classifier import get_purchase_pipe_description
+ clean_description = get_purchase_pipe_description(m.original_description)
+ valve_key = f"{clean_description}|{m.size_spec}|{m.material_grade}"
+
+ if valve_key not in valve_groups:
+ valve_groups[valve_key] = {
+ "total_quantity": 0,
+ "materials": []
+ }
+
+ valve_groups[valve_key]["total_quantity"] += material_dict["quantity"]
+ valve_groups[valve_key]["materials"].append(material_dict)
+ material_dict['clean_description'] = clean_description
elif m.classified_category == 'BOLT':
bolt_query = text("SELECT * FROM bolt_details WHERE material_id = :material_id")
bolt_result = db.execute(bolt_query, {"material_id": m.id})
@@ -1329,8 +1641,199 @@ async def get_materials(
"coating_type": bolt_detail.coating_type,
"pressure_rating": bolt_detail.pressure_rating
}
+
+                        # bolt grouping - by size, material, and length
+                        # extract the length from the original description
+ import re
+ length_match = re.search(r'(\d+(?:\.\d+)?)\s*(?:LG|MM)', m.original_description.upper())
+ bolt_length = length_match.group(1) if length_match else 'UNKNOWN'
+
+ bolt_key = f"{m.size_spec}|{m.material_grade}|{bolt_length}"
+
+ if bolt_key not in bolt_groups:
+ bolt_groups[bolt_key] = {
+ "total_quantity": 0,
+ "materials": []
+ }
+
+ bolt_groups[bolt_key]["total_quantity"] += material_dict["quantity"]
+ bolt_groups[bolt_key]["materials"].append(material_dict)
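The regex above accepts lengths written either as `120 MM` or `75LG`; bolts without a parseable length all share the `'UNKNOWN'` group key. A standalone sketch of the same extraction:

```python
import re

# Pull the first "<number> LG" or "<number> MM" token out of a bolt
# description; fall back to 'UNKNOWN' so unparseable bolts still get a key.
def extract_bolt_length(description: str) -> str:
    match = re.search(r'(\d+(?:\.\d+)?)\s*(?:LG|MM)', description.upper())
    return match.group(1) if match else 'UNKNOWN'
```

One consequence worth knowing: every bolt description without a length token lands in the same `'UNKNOWN'`-length group, so distinct unparseable bolts can be merged if their size and grade also match.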
- material_list.append(material_dict)
+                # append directly only items that are not pipes, nipples, general fittings, or flanges (those are appended after grouping)
+ is_nipple = (m.classified_category == 'FITTING' and
+ ('NIPPLE' in m.original_description.upper() or
+ (hasattr(m, 'fitting_type') and m.fitting_type == 'NIPPLE')))
+
+                # treat CAP and PLUG as general fittings
+ is_cap_or_plug = (m.classified_category == 'FITTING' and
+ ('CAP' in m.original_description.upper() or 'PLUG' in m.original_description.upper()))
+
+ is_general_fitting = (m.classified_category == 'FITTING' and not is_nipple and
+ ((hasattr(m, 'fitting_type') and m.fitting_type is not None) or is_cap_or_plug))
+
+ is_flange = (m.classified_category == 'FLANGE')
+
+ is_valve = (m.classified_category == 'VALVE')
+
+ is_bolt = (m.classified_category == 'BOLT')
+
+ is_gasket = (m.classified_category == 'GASKET')
+
+                # group UNKNOWN-category items
+ if m.classified_category == 'UNKNOWN':
+ unknown_key = m.original_description or 'UNKNOWN'
+
+ if unknown_key not in unknown_groups:
+ unknown_groups[unknown_key] = {
+ "total_quantity": 0,
+ "materials": []
+ }
+
+ unknown_groups[unknown_key]["total_quantity"] += material_dict["quantity"]
+ unknown_groups[unknown_key]["materials"].append(material_dict)
+ elif m.classified_category != 'PIPE' and not is_nipple and not is_general_fitting and not is_flange and not is_valve and not is_bolt and not is_gasket:
+ material_list.append(material_dict)
+
+            # append one representative pipe per group (carrying the aggregated info)
+ for pipe_key, group_info in pipe_groups.items():
+ if group_info["materials"]:
+                    # use the group's first pipe as the representative
+                    representative_pipe = group_info["materials"][0].copy()
+
+                    # update with the aggregated info
+                    representative_pipe['quantity'] = group_info["total_quantity"]
+                    representative_pipe['original_description'] = representative_pipe['clean_description']  # use the clean description
+
+                    if 'pipe_details' in representative_pipe:
+                        representative_pipe['pipe_details']['total_length_mm'] = group_info["total_length_mm"]
+                        representative_pipe['pipe_details']['pipe_count'] = group_info["total_quantity"]  # add pipe_count
+                        representative_pipe['pipe_details']['group_total_quantity'] = group_info["total_quantity"]
+                        # compute the average unit length
+                        if group_info["total_quantity"] > 0:
+                            representative_pipe['pipe_details']['avg_length_mm'] = group_info["total_length_mm"] / group_info["total_quantity"]
+
+ material_list.append(representative_pipe)
+
+            # append one representative nipple per group (carrying the aggregated info)
+            try:
+                for nipple_key, group_info in nipple_groups.items():
+                    if group_info["materials"]:
+                        # use the group's first nipple as the representative
+                        representative_nipple = group_info["materials"][0].copy()
+
+                        # update with the aggregated info
+                        representative_nipple['quantity'] = group_info["total_quantity"]
+                        representative_nipple['original_description'] = representative_nipple.get('clean_description', representative_nipple['original_description'])  # use the clean description
+
+                        if 'fitting_details' in representative_nipple:
+                            representative_nipple['fitting_details']['total_length_mm'] = group_info["total_length_mm"]
+                            representative_nipple['fitting_details']['group_total_quantity'] = group_info["total_quantity"]
+                            # compute the average unit length
+                            if group_info["total_quantity"] > 0:
+                                representative_nipple['fitting_details']['avg_length_mm'] = group_info["total_length_mm"] / group_info["total_quantity"]
+
+                        material_list.append(representative_nipple)
+            except Exception as nipple_error:
+                # log removed; continue even if nipple grouping fails
+                pass
+
+            # append one representative fitting per group (carrying the aggregated info)
+            try:
+                for fitting_key, group_info in fitting_groups.items():
+                    if group_info["materials"]:
+                        representative_fitting = group_info["materials"][0].copy()
+                        representative_fitting['quantity'] = group_info["total_quantity"]
+                        representative_fitting['original_description'] = representative_fitting.get('clean_description', representative_fitting['original_description'])
+
+                        if 'fitting_details' in representative_fitting:
+                            representative_fitting['fitting_details']['group_total_quantity'] = group_info["total_quantity"]
+                        material_list.append(representative_fitting)
+            except Exception as fitting_error:
+                # log removed; continue even if fitting grouping fails
+                pass
+
+            # append one representative flange per group (carrying the aggregated info)
+ try:
+ for flange_key, group_info in flange_groups.items():
+ if group_info["materials"]:
+ representative_flange = group_info["materials"][0].copy()
+ representative_flange['quantity'] = group_info["total_quantity"]
+ # original_description์ ๊ทธ๋๋ก ์ ์ง (SCH ์ ๋ณด ๋ณด์กด)
+ # representative_flange['original_description'] = representative_flange.get('clean_description', representative_flange['original_description'])
+
+ if 'flange_details' in representative_flange:
+ representative_flange['flange_details']['group_total_quantity'] = group_info["total_quantity"]
+ material_list.append(representative_flange)
+ except Exception as flange_error:
+ # ํ๋์ง ๊ทธ๋ฃนํ ์คํจ์์๋ ๊ณ์ ์งํ
+ pass
+
+ # ๋ฐธ๋ธ ๊ทธ๋ฃน๋ณ๋ก ๋ํ ๋ฐธ๋ธ ํ๋๋ง ์ถ๊ฐ (๊ทธ๋ฃนํ๋ ์ ๋ณด๋ก)
+ print(f"DEBUG: ์ ์ฒด ๋ฐธ๋ธ ์: {valve_count}, valve_groups ์: {len(valve_groups)}")
+ try:
+ for valve_key, group_info in valve_groups.items():
+ if group_info["materials"]:
+ representative_valve = group_info["materials"][0].copy()
+ representative_valve['quantity'] = group_info["total_quantity"]
+
+ if 'valve_details' in representative_valve:
+ representative_valve['valve_details']['group_total_quantity'] = group_info["total_quantity"]
+ material_list.append(representative_valve)
+ print(f"DEBUG: ๋ฐธ๋ธ ์ถ๊ฐ๋จ - {valve_key}, ์๋: {group_info['total_quantity']}")
+ except Exception as valve_error:
+ print(f"ERROR: ๋ฐธ๋ธ ๊ทธ๋ฃนํ ์คํจ - {valve_error}")
+ # ๋ฐธ๋ธ ๊ทธ๋ฃนํ ์คํจ์์๋ ๊ณ์ ์งํ
+ pass
+
+ # ๋ณผํธ ๊ทธ๋ฃน๋ณ๋ก ๋ํ ๋ณผํธ ํ๋๋ง ์ถ๊ฐ (๊ทธ๋ฃนํ๋ ์ ๋ณด๋ก)
+ print(f"DEBUG: bolt_groups ์: {len(bolt_groups)}")
+ try:
+ for bolt_key, group_info in bolt_groups.items():
+ if group_info["materials"]:
+ representative_bolt = group_info["materials"][0].copy()
+ representative_bolt['quantity'] = group_info["total_quantity"]
+
+ if 'bolt_details' in representative_bolt:
+ representative_bolt['bolt_details']['group_total_quantity'] = group_info["total_quantity"]
+ material_list.append(representative_bolt)
+ print(f"DEBUG: ๋ณผํธ ์ถ๊ฐ๋จ - {bolt_key}, ์๋: {group_info['total_quantity']}")
+ except Exception as bolt_error:
+ print(f"ERROR: ๋ณผํธ ๊ทธ๋ฃนํ ์คํจ - {bolt_error}")
+ # ๋ณผํธ ๊ทธ๋ฃนํ ์คํจ์์๋ ๊ณ์ ์งํ
+ pass
+
+ # ๊ฐ์ค์ผ ๊ทธ๋ฃน๋ณ๋ก ๋ํ ๊ฐ์ค์ผ ํ๋๋ง ์ถ๊ฐ (๊ทธ๋ฃนํ๋ ์ ๋ณด๋ก)
+ print(f"DEBUG: gasket_groups ์: {len(gasket_groups)}")
+ try:
+ for gasket_key, group_info in gasket_groups.items():
+ if group_info["materials"]:
+ representative_gasket = group_info["materials"][0].copy()
+ representative_gasket['quantity'] = group_info["total_quantity"]
+
+ if 'gasket_details' in representative_gasket:
+ representative_gasket['gasket_details']['group_total_quantity'] = group_info["total_quantity"]
+ material_list.append(representative_gasket)
+ print(f"DEBUG: ๊ฐ์ค์ผ ์ถ๊ฐ๋จ - {gasket_key}, ์๋: {group_info['total_quantity']}")
+ except Exception as gasket_error:
+ print(f"ERROR: ๊ฐ์ค์ผ ๊ทธ๋ฃนํ ์คํจ - {gasket_error}")
+ # ๊ฐ์ค์ผ ๊ทธ๋ฃนํ ์คํจ์์๋ ๊ณ์ ์งํ
+ pass
+
+ # UNKNOWN ๊ทธ๋ฃน๋ณ๋ก ๋ํ ํญ๋ชฉ ํ๋๋ง ์ถ๊ฐ (๊ทธ๋ฃนํ๋ ์ ๋ณด๋ก)
+ print(f"DEBUG: unknown_groups ์: {len(unknown_groups)}")
+ try:
+ for unknown_key, group_info in unknown_groups.items():
+ if group_info["materials"]:
+ representative_unknown = group_info["materials"][0].copy()
+ representative_unknown['quantity'] = group_info["total_quantity"]
+ material_list.append(representative_unknown)
+ print(f"DEBUG: UNKNOWN ์ถ๊ฐ๋จ - {unknown_key[:50]}, ์๋: {group_info['total_quantity']}")
+ except Exception as unknown_error:
+ print(f"ERROR: UNKNOWN ๊ทธ๋ฃนํ ์คํจ - {unknown_error}")
+ # UNKNOWN ๊ทธ๋ฃนํ ์คํจ์์๋ ๊ณ์ ์งํ
+ pass
return {
"success": True,
@@ -1840,4 +2343,251 @@ async def create_user_requirement(
except Exception as e:
db.rollback()
- raise HTTPException(status_code=500, detail=f"์๊ตฌ์ฌํญ ์์ฑ ์คํจ: {str(e)}")
\ No newline at end of file
+ raise HTTPException(status_code=500, detail=f"์๊ตฌ์ฌํญ ์์ฑ ์คํจ: {str(e)}")
+
+@router.post("/materials/{material_id}/verify")
+async def verify_material_classification(
+ material_id: int,
+ request: Request,
+ verified_category: Optional[str] = None,
+ db: Session = Depends(get_db),
+ current_user: dict = Depends(get_current_user)
+):
+ """
+ ์์ฌ ๋ถ๋ฅ ๊ฒฐ๊ณผ ๊ฒ์ฆ
+ """
+ try:
+ username = current_user.get('username', 'unknown')
+
+ # ์์ฌ ์กด์ฌ ํ์ธ
+ material_query = text("SELECT * FROM materials WHERE id = :material_id")
+ material_result = db.execute(material_query, {"material_id": material_id})
+ material = material_result.fetchone()
+
+ if not material:
+ raise HTTPException(status_code=404, detail="์์ฌ๋ฅผ ์ฐพ์ ์ ์์ต๋๋ค")
+
+        # ๊ฒ์ฆ ์ ๋ณด ์
๋ฐ์ดํธ
+ update_query = text("""
+ UPDATE materials
+ SET is_verified = TRUE,
+ verified_by = :username,
+ verified_at = CURRENT_TIMESTAMP,
+ classified_category = COALESCE(:verified_category, classified_category)
+ WHERE id = :material_id
+ """)
+
+ db.execute(update_query, {
+ "material_id": material_id,
+ "username": username,
+ "verified_category": verified_category
+ })
+
+ # ํ๋ ๋ก๊ทธ ๊ธฐ๋ก
+ try:
+ from ..services.activity_logger import log_activity_from_request
+ log_activity_from_request(
+ db, request, username,
+ "MATERIAL_VERIFY",
+ f"์์ฌ ๋ถ๋ฅ ๊ฒ์ฆ: {material.original_description}"
+ )
+ except Exception as e:
+ print(f"ํ๋ ๋ก๊ทธ ๊ธฐ๋ก ์คํจ: {str(e)}")
+
+ db.commit()
+
+ return {
+ "success": True,
+ "message": "์์ฌ ๋ถ๋ฅ๊ฐ ๊ฒ์ฆ๋์์ต๋๋ค",
+ "material_id": material_id,
+ "verified_by": username
+ }
+
+    except HTTPException:
+        # 404 ๋ฑ HTTPException์ 500์ผ๋ก ๋ฎ์ด์ฐ์ง ์๊ณ  ๊ทธ๋๋ก ์ ๋ฌ
+        raise
+    except Exception as e:
+        db.rollback()
+        raise HTTPException(status_code=500, detail=f"์์ฌ ๊ฒ์ฆ ์คํจ: {str(e)}")
+
+@router.put("/materials/{material_id}/update-classification")
+async def update_material_classification(
+ material_id: int,
+ request: Request,
+ classified_category: str = Form(...),
+    classified_subcategory: Optional[str] = Form(None),
+    material_grade: Optional[str] = Form(None),
+    schedule: Optional[str] = Form(None),
+    size_spec: Optional[str] = Form(None),
+ db: Session = Depends(get_db),
+ current_user: dict = Depends(get_current_user)
+):
+ """BOM ๊ด๋ฆฌ ํ์ด์ง์์ ์ฌ์ฉ์๊ฐ ๋ถ๋ฅ๋ฅผ ์์ ํ๋ API"""
+ try:
+ username = current_user.get("username", "unknown")
+
+ # ์์ฌ ์กด์ฌ ํ์ธ
+ check_query = text("SELECT id, original_description FROM materials WHERE id = :material_id")
+ result = db.execute(check_query, {"material_id": material_id})
+ material = result.fetchone()
+
+ if not material:
+ raise HTTPException(status_code=404, detail="์์ฌ๋ฅผ ์ฐพ์ ์ ์์ต๋๋ค")
+
+        # ๋ถ๋ฅ ์ ๋ณด ์
๋ฐ์ดํธ
+ update_query = text("""
+ UPDATE materials
+ SET classified_category = :classified_category,
+ classified_subcategory = :classified_subcategory,
+ material_grade = :material_grade,
+ schedule = :schedule,
+ size_spec = :size_spec,
+ is_verified = true,
+ verified_by = :verified_by,
+ verified_at = NOW(),
+ updated_at = NOW()
+ WHERE id = :material_id
+ """)
+
+ db.execute(update_query, {
+ "material_id": material_id,
+ "classified_category": classified_category,
+ "classified_subcategory": classified_subcategory or "",
+ "material_grade": material_grade or "",
+ "schedule": schedule or "",
+ "size_spec": size_spec or "",
+ "verified_by": username
+ })
+
+ db.commit()
+
+ # ํ๋ ๋ก๊ทธ ๊ธฐ๋ก
+ await log_activity_from_request(
+ request,
+ db,
+ "material_classification_update",
+ f"์์ฌ ๋ถ๋ฅ ์์ : {material.original_description} -> {classified_category}",
+ {"material_id": material_id, "category": classified_category}
+ )
+
+ return {
+ "success": True,
+            "message": "์์ฌ ๋ถ๋ฅ๊ฐ ์ฑ๊ณต์ ์ผ๋ก ์
๋ฐ์ดํธ๋์์ต๋๋ค",
+ "material_id": material_id,
+ "classified_category": classified_category
+ }
+
+    except HTTPException:
+        # 404 ๋ฑ HTTPException์ 500์ผ๋ก ๋ฎ์ด์ฐ์ง ์๊ณ  ๊ทธ๋๋ก ์ ๋ฌ
+        raise
+    except Exception as e:
+        db.rollback()
+        print(f"์์ฌ ๋ถ๋ฅ ์
๋ฐ์ดํธ ์คํจ: {str(e)}")
+        raise HTTPException(status_code=500, detail=f"์์ฌ ๋ถ๋ฅ ์
๋ฐ์ดํธ ์คํจ: {str(e)}")
+
+@router.post("/materials/confirm-purchase")
+async def confirm_material_purchase_api(
+ request: Request,
+ job_no: str = Query(...),
+ revision: str = Query(...),
+ confirmed_by: str = Query("user"),
+ confirmations_data: List[Dict] = Body(...),
+ db: Session = Depends(get_db),
+ current_user: dict = Depends(get_current_user)
+):
+ """์์ฌ ๊ตฌ๋งค์๋ ํ์ API (ํ๋ก ํธ์๋ ํธํ)"""
+ try:
+
+        # ์
๋ ฅ ๋ฐ์ดํฐ ๊ฒ์ฆ
+ if not job_no or not revision:
+            raise HTTPException(status_code=400, detail="Job ๋ฒํธ์ ๋ฆฌ๋น์ ์ ํ์์
๋๋ค")
+
+ if not confirmations_data:
+ raise HTTPException(status_code=400, detail="ํ์ ํ ์์ฌ๊ฐ ์์ต๋๋ค")
+
+ # ๊ฐ ํ์ ํญ๋ชฉ ๊ฒ์ฆ
+ for i, confirmation in enumerate(confirmations_data):
+ if not confirmation.get("material_hash"):
+ raise HTTPException(status_code=400, detail=f"{i+1}๋ฒ์งธ ํญ๋ชฉ์ material_hash๊ฐ ์์ต๋๋ค")
+
+ confirmed_qty = confirmation.get("confirmed_quantity")
+ if confirmed_qty is None or confirmed_qty < 0:
+ raise HTTPException(status_code=400, detail=f"{i+1}๋ฒ์งธ ํญ๋ชฉ์ ํ์ ์๋์ด ์ ํจํ์ง ์์ต๋๋ค")
+
+ confirmed_items = []
+
+ for confirmation in confirmations_data:
+            # ๋ฐ์ฃผ ์ถ์  ํ
์ด๋ธ์ ์ ์ฅ/์
๋ฐ์ดํธ
+ upsert_query = text("""
+ INSERT INTO material_purchase_tracking (
+ job_no, material_hash, revision, description, size_spec, unit,
+ bom_quantity, calculated_quantity, confirmed_quantity,
+ purchase_status, supplier_name, unit_price, total_price,
+ confirmed_by, confirmed_at
+ )
+ SELECT
+ :job_no, m.material_hash, :revision, m.original_description,
+ m.size_spec, m.unit, m.quantity, :calculated_qty, :confirmed_qty,
+ 'CONFIRMED', :supplier_name, :unit_price, :total_price,
+ :confirmed_by, CURRENT_TIMESTAMP
+ FROM materials m
+ WHERE m.material_hash = :material_hash
+ AND m.file_id = (
+ SELECT id FROM files
+ WHERE job_no = :job_no AND revision = :revision
+ ORDER BY upload_date DESC LIMIT 1
+ )
+ LIMIT 1
+ ON CONFLICT (job_no, material_hash, revision)
+ DO UPDATE SET
+ confirmed_quantity = :confirmed_qty,
+ purchase_status = 'CONFIRMED',
+ supplier_name = :supplier_name,
+ unit_price = :unit_price,
+ total_price = :total_price,
+ confirmed_by = :confirmed_by,
+ confirmed_at = CURRENT_TIMESTAMP,
+ updated_at = CURRENT_TIMESTAMP
+ RETURNING id, description, confirmed_quantity
+ """)
+
+ calculated_qty = confirmation.get("calculated_quantity", confirmation["confirmed_quantity"])
+ total_price = confirmation["confirmed_quantity"] * confirmation.get("unit_price", 0)
+
+ result = db.execute(upsert_query, {
+ "job_no": job_no,
+ "revision": revision,
+ "material_hash": confirmation["material_hash"],
+ "calculated_qty": calculated_qty,
+ "confirmed_qty": confirmation["confirmed_quantity"],
+ "supplier_name": confirmation.get("supplier_name", ""),
+ "unit_price": confirmation.get("unit_price", 0),
+ "total_price": total_price,
+ "confirmed_by": confirmed_by
+ })
+
+ confirmed_item = result.fetchone()
+ if confirmed_item:
+ confirmed_items.append({
+ "id": confirmed_item[0],
+ "description": confirmed_item[1],
+ "confirmed_quantity": confirmed_item[2]
+ })
+
+ db.commit()
+
+ # ํ๋ ๋ก๊ทธ ๊ธฐ๋ก
+ await log_activity_from_request(
+ request,
+ db,
+ "material_purchase_confirm",
+ f"๊ตฌ๋งค์๋ ํ์ : {job_no} {revision} - {len(confirmed_items)}๊ฐ ํ๋ชฉ",
+ {"job_no": job_no, "revision": revision, "items_count": len(confirmed_items)}
+ )
+
+ return {
+ "success": True,
+ "message": f"{len(confirmed_items)}๊ฐ ํ๋ชฉ์ ๊ตฌ๋งค์๋์ด ํ์ ๋์์ต๋๋ค",
+ "job_no": job_no,
+ "revision": revision,
+ "confirmed_items": confirmed_items
+ }
+
+    except HTTPException:
+        # 400 ๊ฒ์ฆ ์ค๋ฅ ๋ฑ HTTPException์ 500์ผ๋ก ๋ฎ์ด์ฐ์ง ์๊ณ  ๊ทธ๋๋ก ์ ๋ฌ
+        raise
+    except Exception as e:
+        db.rollback()
+        print(f"๊ตฌ๋งค์๋ ํ์  ์คํจ: {str(e)}")
+        raise HTTPException(status_code=500, detail=f"๊ตฌ๋งค์๋ ํ์  ์คํจ: {str(e)}")
\ No newline at end of file
diff --git a/backend/app/routers/files.py.backup2 b/backend/app/routers/files.py.backup2
deleted file mode 100644
index 3af406b..0000000
--- a/backend/app/routers/files.py.backup2
+++ /dev/null
@@ -1,399 +0,0 @@
-from fastapi import APIRouter, Depends, HTTPException, UploadFile, File, Form
-from sqlalchemy.orm import Session
-from sqlalchemy import text
-from typing import List, Optional
-import os
-import shutil
-from datetime import datetime
-import uuid
-import pandas as pd
-import re
-from pathlib import Path
-
-from ..database import get_db
-
-router = APIRouter()
-
-UPLOAD_DIR = Path("uploads")
-UPLOAD_DIR.mkdir(exist_ok=True)
-ALLOWED_EXTENSIONS = {".xlsx", ".xls", ".csv"}
-
-@router.get("/")
-async def get_files_info():
- return {
- "message": "ํ์ผ ๊ด๋ฆฌ API",
- "allowed_extensions": list(ALLOWED_EXTENSIONS),
- "upload_directory": str(UPLOAD_DIR)
- }
-
-@router.get("/test")
-async def test_endpoint():
- return {"status": "ํ์ผ API๊ฐ ์ ์ ์๋ํฉ๋๋ค!"}
-
-@router.post("/add-missing-columns")
-async def add_missing_columns(db: Session = Depends(get_db)):
- """๋๋ฝ๋ ์ปฌ๋ผ๋ค ์ถ๊ฐ"""
- try:
- db.execute(text("ALTER TABLE files ADD COLUMN IF NOT EXISTS parsed_count INTEGER DEFAULT 0"))
- db.execute(text("ALTER TABLE materials ADD COLUMN IF NOT EXISTS row_number INTEGER"))
- db.commit()
-
- return {
- "success": True,
- "message": "๋๋ฝ๋ ์ปฌ๋ผ๋ค์ด ์ถ๊ฐ๋์์ต๋๋ค",
- "added_columns": ["files.parsed_count", "materials.row_number"]
- }
- except Exception as e:
- db.rollback()
- return {"success": False, "error": f"์ปฌ๋ผ ์ถ๊ฐ ์คํจ: {str(e)}"}
-
-def validate_file_extension(filename: str) -> bool:
- return Path(filename).suffix.lower() in ALLOWED_EXTENSIONS
-
-def generate_unique_filename(original_filename: str) -> str:
- timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
- unique_id = str(uuid.uuid4())[:8]
- stem = Path(original_filename).stem
- suffix = Path(original_filename).suffix
- return f"{stem}_{timestamp}_{unique_id}{suffix}"
-
-def parse_dataframe(df):
- df = df.dropna(how='all')
- df.columns = df.columns.str.strip().str.lower()
-
- column_mapping = {
-        'description': ['description', 'item', 'material', 'ํ๋ช
', '์์ฌ๋ช
'],
- 'quantity': ['qty', 'quantity', 'ea', '์๋'],
- 'main_size': ['main_nom', 'nominal_diameter', 'nd', '์ฃผ๋ฐฐ๊ด'],
- 'red_size': ['red_nom', 'reduced_diameter', '์ถ์๋ฐฐ๊ด'],
- 'length': ['length', 'len', '๊ธธ์ด'],
- 'weight': ['weight', 'wt', '์ค๋'],
-        'dwg_name': ['dwg_name', 'drawing', '๋๋ฉด๋ช
'],
- 'line_num': ['line_num', 'line_number', '๋ผ์ธ๋ฒํธ']
- }
-
- mapped_columns = {}
- for standard_col, possible_names in column_mapping.items():
- for possible_name in possible_names:
- if possible_name in df.columns:
- mapped_columns[standard_col] = possible_name
- break
-
- materials = []
- for index, row in df.iterrows():
- description = str(row.get(mapped_columns.get('description', ''), ''))
- quantity_raw = row.get(mapped_columns.get('quantity', ''), 0)
-
- try:
- quantity = float(quantity_raw) if pd.notna(quantity_raw) else 0
- except:
- quantity = 0
-
- material_grade = ""
- if "ASTM" in description.upper():
- astm_match = re.search(r'ASTM\s+([A-Z0-9\s]+)', description.upper())
- if astm_match:
- material_grade = astm_match.group(0).strip()
-
- main_size = str(row.get(mapped_columns.get('main_size', ''), ''))
- red_size = str(row.get(mapped_columns.get('red_size', ''), ''))
-
- if main_size != 'nan' and red_size != 'nan' and red_size != '':
- size_spec = f"{main_size} x {red_size}"
- elif main_size != 'nan' and main_size != '':
- size_spec = main_size
- else:
- size_spec = ""
-
- if description and description not in ['nan', 'None', '']:
- materials.append({
- 'original_description': description,
- 'quantity': quantity,
- 'unit': "EA",
- 'size_spec': size_spec,
- 'material_grade': material_grade,
- 'line_number': index + 1,
- 'row_number': index + 1
- })
-
- return materials
-
-def parse_file_data(file_path):
- file_extension = Path(file_path).suffix.lower()
-
- try:
- if file_extension == ".csv":
- df = pd.read_csv(file_path, encoding='utf-8')
- elif file_extension in [".xlsx", ".xls"]:
- df = pd.read_excel(file_path, sheet_name=0)
- else:
- raise HTTPException(status_code=400, detail="์ง์ํ์ง ์๋ ํ์ผ ํ์")
-
- return parse_dataframe(df)
- except Exception as e:
- raise HTTPException(status_code=400, detail=f"ํ์ผ ํ์ฑ ์คํจ: {str(e)}")
-
-@router.post("/upload")
-async def upload_file(
- file: UploadFile = File(...),
- job_no: str = Form(...),
- revision: str = Form("Rev.0"),
- db: Session = Depends(get_db)
-):
- if not validate_file_extension(file.filename):
- raise HTTPException(
- status_code=400,
-            detail=f"์ง์ํ์ง ์๋ ํ์ผ ํ์์
๋๋ค. ํ์ฉ๋ ํ์ฅ์: {', '.join(ALLOWED_EXTENSIONS)}"
- )
-
- if file.size and file.size > 10 * 1024 * 1024:
- raise HTTPException(status_code=400, detail="ํ์ผ ํฌ๊ธฐ๋ 10MB๋ฅผ ์ด๊ณผํ ์ ์์ต๋๋ค")
-
- unique_filename = generate_unique_filename(file.filename)
- file_path = UPLOAD_DIR / unique_filename
-
- try:
- with open(file_path, "wb") as buffer:
- shutil.copyfileobj(file.file, buffer)
- except Exception as e:
- raise HTTPException(status_code=500, detail=f"ํ์ผ ์ ์ฅ ์คํจ: {str(e)}")
-
- try:
- materials_data = parse_file_data(str(file_path))
- parsed_count = len(materials_data)
-
- # ํ์ผ ์ ๋ณด ์ ์ฅ
- file_insert_query = text("""
- INSERT INTO files (filename, original_filename, file_path, job_no, revision, description, file_size, parsed_count, is_active)
- VALUES (:filename, :original_filename, :file_path, :job_no, :revision, :description, :file_size, :parsed_count, :is_active)
- RETURNING id
- """)
-
- file_result = db.execute(file_insert_query, {
- "filename": unique_filename,
- "original_filename": file.filename,
- "file_path": str(file_path),
- "job_no": job_no,
- "revision": revision,
- "description": f"BOM ํ์ผ - {parsed_count}๊ฐ ์์ฌ",
- "file_size": file.size,
- "parsed_count": parsed_count,
- "is_active": True
- })
-
- file_id = file_result.fetchone()[0]
-
- # ์์ฌ ๋ฐ์ดํฐ ์ ์ฅ
- materials_inserted = 0
- for material_data in materials_data:
- material_insert_query = text("""
- INSERT INTO materials (
- file_id, original_description, quantity, unit, size_spec,
- material_grade, line_number, row_number, classified_category,
- classification_confidence, is_verified, created_at
- )
- VALUES (
- :file_id, :original_description, :quantity, :unit, :size_spec,
- :material_grade, :line_number, :row_number, :classified_category,
- :classification_confidence, :is_verified, :created_at
- )
- """)
-
- db.execute(material_insert_query, {
- "file_id": file_id,
- "original_description": material_data["original_description"],
- "quantity": material_data["quantity"],
- "unit": material_data["unit"],
- "size_spec": material_data["size_spec"],
- "material_grade": material_data["material_grade"],
- "line_number": material_data["line_number"],
- "row_number": material_data["row_number"],
- "classified_category": None,
- "classification_confidence": None,
- "is_verified": False,
- "created_at": datetime.now()
- })
- materials_inserted += 1
-
- db.commit()
-
- return {
- "success": True,
- "message": f"์์ ํ DB ์ ์ฅ ์ฑ๊ณต! {materials_inserted}๊ฐ ์์ฌ ์ ์ฅ๋จ",
- "original_filename": file.filename,
- "file_id": file_id,
- "parsed_materials_count": parsed_count,
- "saved_materials_count": materials_inserted,
- "sample_materials": materials_data[:3] if materials_data else []
- }
-
- except Exception as e:
- db.rollback()
- if os.path.exists(file_path):
- os.remove(file_path)
- raise HTTPException(status_code=500, detail=f"ํ์ผ ์ฒ๋ฆฌ ์คํจ: {str(e)}")
-@router.get("/materials")
-async def get_materials(
- job_no: Optional[str] = None,
- file_id: Optional[str] = None,
- skip: int = 0,
- limit: int = 100,
- db: Session = Depends(get_db)
-):
- """์ ์ฅ๋ ์์ฌ ๋ชฉ๋ก ์กฐํ"""
- try:
- query = """
- SELECT m.id, m.file_id, m.original_description, m.quantity, m.unit,
- m.size_spec, m.material_grade, m.line_number, m.row_number,
- m.created_at,
- f.original_filename, f.job_no,
- j.job_no, j.job_name
- FROM materials m
- LEFT JOIN files f ON m.file_id = f.id
- LEFT JOIN jobs j ON f.job_no = j.job_no
- WHERE 1=1
- """
-
- params = {}
-
- if job_no:
- query += " AND f.job_no = :job_no"
- params["job_no"] = job_no
-
- if file_id:
- query += " AND m.file_id = :file_id"
- params["file_id"] = file_id
-
- query += " ORDER BY m.line_number ASC LIMIT :limit OFFSET :skip"
- params["limit"] = limit
- params["skip"] = skip
-
- result = db.execute(text(query), params)
- materials = result.fetchall()
-
- # ์ ์ฒด ๊ฐ์ ์กฐํ
- count_query = """
- SELECT COUNT(*) as total
- FROM materials m
- LEFT JOIN files f ON m.file_id = f.id
- WHERE 1=1
- """
- count_params = {}
-
- if job_no:
- count_query += " AND f.job_no = :job_no"
- count_params["job_no"] = job_no
-
- if file_id:
- count_query += " AND m.file_id = :file_id"
- count_params["file_id"] = file_id
-
- count_result = db.execute(text(count_query), count_params)
- total_count = count_result.fetchone()[0]
-
- return {
- "success": True,
- "total_count": total_count,
- "returned_count": len(materials),
- "skip": skip,
- "limit": limit,
- "materials": [
- {
- "id": m.id,
- "file_id": m.file_id,
- "filename": m.original_filename,
- "job_no": m.job_no,
- "project_code": m.official_project_code,
- "project_name": m.project_name,
- "original_description": m.original_description,
- "quantity": float(m.quantity) if m.quantity else 0,
- "unit": m.unit,
- "size_spec": m.size_spec,
- "material_grade": m.material_grade,
- "line_number": m.line_number,
- "row_number": m.row_number,
- "created_at": m.created_at
- }
- for m in materials
- ]
- }
-
- except Exception as e:
- raise HTTPException(status_code=500, detail=f"์์ฌ ์กฐํ ์คํจ: {str(e)}")
-
-@router.get("/materials/summary")
-async def get_materials_summary(
- job_no: Optional[str] = None,
- file_id: Optional[str] = None,
- db: Session = Depends(get_db)
-):
- """์์ฌ ์์ฝ ํต๊ณ"""
- try:
- query = """
- SELECT
- COUNT(*) as total_items,
- COUNT(DISTINCT m.original_description) as unique_descriptions,
- COUNT(DISTINCT m.size_spec) as unique_sizes,
- COUNT(DISTINCT m.material_grade) as unique_materials,
- SUM(m.quantity) as total_quantity,
- AVG(m.quantity) as avg_quantity,
- MIN(m.created_at) as earliest_upload,
- MAX(m.created_at) as latest_upload
- FROM materials m
- LEFT JOIN files f ON m.file_id = f.id
- WHERE 1=1
- """
-
- params = {}
-
- if job_no:
- query += " AND f.job_no = :job_no"
- params["job_no"] = job_no
-
- if file_id:
- query += " AND m.file_id = :file_id"
- params["file_id"] = file_id
-
- result = db.execute(text(query), params)
- summary = result.fetchone()
-
- return {
- "success": True,
- "summary": {
- "total_items": summary.total_items,
- "unique_descriptions": summary.unique_descriptions,
- "unique_sizes": summary.unique_sizes,
- "unique_materials": summary.unique_materials,
- "total_quantity": float(summary.total_quantity) if summary.total_quantity else 0,
- "avg_quantity": round(float(summary.avg_quantity), 2) if summary.avg_quantity else 0,
- "earliest_upload": summary.earliest_upload,
- "latest_upload": summary.latest_upload
- }
- }
-
- except Exception as e:
- raise HTTPException(status_code=500, detail=f"์์ฝ ์กฐํ ์คํจ: {str(e)}")
-# Job ๊ฒ์ฆ ํจ์ (ํ์ผ ๋์ ์ถ๊ฐํ ์์ )
-async def validate_job_exists(job_no: str, db: Session):
- """Job ์กด์ฌ ์ฌ๋ถ ๋ฐ ํ์ฑ ์ํ ํ์ธ"""
- try:
- query = text("SELECT job_no, job_name, status FROM jobs WHERE job_no = :job_no AND is_active = true")
- job = db.execute(query, {"job_no": job_no}).fetchone()
-
- if not job:
- return {"valid": False, "error": f"Job No. '{job_no}'๋ฅผ ์ฐพ์ ์ ์์ต๋๋ค"}
-
- if job.status == '์๋ฃ':
-            return {"valid": False, "error": f"์๋ฃ๋ Job '{job.job_name}'์๋ ํ์ผ์ ์
๋ก๋ํ  ์ ์์ต๋๋ค"}
-
- return {
- "valid": True,
- "job": {
- "job_no": job.job_no,
- "job_name": job.job_name,
- "status": job.status
- }
- }
-
- except Exception as e:
- return {"valid": False, "error": f"Job ๊ฒ์ฆ ์คํจ: {str(e)}"}
diff --git a/backend/app/routers/material_comparison.py b/backend/app/routers/material_comparison.py
index 110ca0d..008808a 100644
--- a/backend/app/routers/material_comparison.py
+++ b/backend/app/routers/material_comparison.py
@@ -157,6 +157,26 @@ async def confirm_material_purchase(
]
"""
try:
+        # ์
๋ ฅ ๋ฐ์ดํฐ ๊ฒ์ฆ
+ if not job_no or not revision:
+            raise HTTPException(status_code=400, detail="Job ๋ฒํธ์ ๋ฆฌ๋น์ ์ ํ์์
๋๋ค")
+
+ if not confirmations:
+ raise HTTPException(status_code=400, detail="ํ์ ํ ์์ฌ๊ฐ ์์ต๋๋ค")
+
+ # ๊ฐ ํ์ ํญ๋ชฉ ๊ฒ์ฆ
+ for i, confirmation in enumerate(confirmations):
+ if not confirmation.get("material_hash"):
+ raise HTTPException(status_code=400, detail=f"{i+1}๋ฒ์งธ ํญ๋ชฉ์ material_hash๊ฐ ์์ต๋๋ค")
+
+ confirmed_qty = confirmation.get("confirmed_quantity")
+ if confirmed_qty is None or confirmed_qty < 0:
+ raise HTTPException(status_code=400, detail=f"{i+1}๋ฒ์งธ ํญ๋ชฉ์ ํ์ ์๋์ด ์ ํจํ์ง ์์ต๋๋ค")
+
+ unit_price = confirmation.get("unit_price", 0)
+ if unit_price < 0:
+ raise HTTPException(status_code=400, detail=f"{i+1}๋ฒ์งธ ํญ๋ชฉ์ ๋จ๊ฐ๊ฐ ์ ํจํ์ง ์์ต๋๋ค")
+
confirmed_items = []
for confirmation in confirmations:
@@ -470,7 +490,7 @@ async def get_materials_by_hash(db: Session, file_id: int) -> Dict[str, Dict]:
"""ํ์ผ์ ์์ฌ๋ฅผ ํด์๋ณ๋ก ๊ทธ๋ฃนํํ์ฌ ์กฐํ"""
import hashlib
- print(f"๐จ๐จ๐จ get_materials_by_hash ํธ์ถ๋จ! file_id={file_id} ๐จ๐จ๐จ")
+ # ๋ก๊ทธ ์ ๊ฑฐ
query = text("""
SELECT
@@ -492,11 +512,7 @@ async def get_materials_by_hash(db: Session, file_id: int) -> Dict[str, Dict]:
result = db.execute(query, {"file_id": file_id})
materials = result.fetchall()
- print(f"๐ ์ฟผ๋ฆฌ ๊ฒฐ๊ณผ ๊ฐ์: {len(materials)}")
- if len(materials) > 0:
- print(f"๐ ์ฒซ ๋ฒ์งธ ์๋ฃ ์ํ: {materials[0]}")
- else:
- print(f"โ ์๋ฃ๊ฐ ์์! file_id={file_id}")
+ # ๋ก๊ทธ ์ ๊ฑฐ
# ๐ ๊ฐ์ ํ์ดํ๋ค์ Python์์ ์ฌ๋ฐ๋ฅด๊ฒ ๊ทธ๋ฃนํ
materials_dict = {}
@@ -505,38 +521,41 @@ async def get_materials_by_hash(db: Session, file_id: int) -> Dict[str, Dict]:
hash_source = f"{mat[1] or ''}|{mat[2] or ''}|{mat[3] or ''}"
material_hash = hashlib.md5(hash_source.encode()).hexdigest()
- print(f"๐ ๊ฐ๋ณ ์์ฌ: {mat[1][:50]}... ({mat[2]}) - ์๋: {mat[4]}, ๊ธธ์ด: {mat[7]}mm")
+ # ๊ฐ๋ณ ์์ฌ ๋ก๊ทธ ์ ๊ฑฐ (๋๋ฌด ๋ง์)
if material_hash in materials_dict:
# ๐ ๊ธฐ์กด ํญ๋ชฉ์ ์๋ ํฉ๊ณ
existing = materials_dict[material_hash]
- existing["quantity"] += float(mat[4]) if mat[4] else 0.0
+ # ํ์ดํ๊ฐ ์๋ ๊ฒฝ์ฐ๋ง quantity ํฉ์ฐ (ํ์ดํ๋ ๊ฐ๋ณ ๊ธธ์ด๊ฐ ๋ค๋ฅด๋ฏ๋ก ํฉ์ฐํ์ง ์์)
+ if mat[5] != 'PIPE':
+ existing["quantity"] += float(mat[4]) if mat[4] else 0.0
existing["line_number"] += f", {mat[8]}" if mat[8] else ""
# ํ์ดํ์ธ ๊ฒฝ์ฐ ๊ธธ์ด ์ ๋ณด ํฉ์ฐ
if mat[5] == 'PIPE' and mat[7] is not None:
if "pipe_details" in existing:
- # ์ด๊ธธ์ด ํฉ์ฐ: ๊ธฐ์กด ์ด๊ธธ์ด + (ํ์ฌ ์๋ ร ํ์ฌ ๊ธธ์ด)
+ # ์ด๊ธธ์ด ํฉ์ฐ: ๊ธฐ์กด ์ด๊ธธ์ด + ํ์ฌ ํ์ดํ์ ์ค์ ๊ธธ์ด (DB์ ์ ์ฅ๋ ๊ฐ๋ณ ๊ธธ์ด)
current_total = existing["pipe_details"]["total_length_mm"]
current_count = existing["pipe_details"]["pipe_count"]
- new_length = float(mat[4]) * float(mat[7]) # ์๋ ร ๋จ์๊ธธ์ด
- existing["pipe_details"]["total_length_mm"] = current_total + new_length
- existing["pipe_details"]["pipe_count"] = current_count + float(mat[4])
+                    # โ
 DB์์ ๊ฐ์ ธ์จ length_mm๋ ์ด๋ฏธ ๊ฐ๋ณ ํ์ดํ์ ์ค์  ๊ธธ์ด์ด๋ฏ๋ก ์๋์ ๊ณฑํ์ง ์์
+ individual_length = float(mat[7]) # ๊ฐ๋ณ ํ์ดํ์ ์ค์ ๊ธธ์ด
+ existing["pipe_details"]["total_length_mm"] = current_total + individual_length
+ existing["pipe_details"]["pipe_count"] = current_count + 1 # ํ์ดํ ๊ฐ์๋ 1๊ฐ์ฉ ์ฆ๊ฐ
# ํ๊ท ๋จ์ ๊ธธ์ด ์ฌ๊ณ์ฐ
total_length = existing["pipe_details"]["total_length_mm"]
total_count = existing["pipe_details"]["pipe_count"]
existing["pipe_details"]["length_mm"] = total_length / total_count
- print(f"๐ ํ์ดํ ํฉ์ฐ: {mat[1]} ({mat[2]}) - ์ด๊ธธ์ด: {total_length}mm, ์ด๊ฐ์: {total_count}๊ฐ, ํ๊ท : {total_length/total_count:.1f}mm")
+ # ํ์ดํ ํฉ์ฐ ๋ก๊ทธ ์ ๊ฑฐ (๋๋ฌด ๋ง์)
else:
# ์ฒซ ํ์ดํ ์ ๋ณด ์ค์
- pipe_length = float(mat[4]) * float(mat[7])
+ individual_length = float(mat[7]) # ๊ฐ๋ณ ํ์ดํ์ ์ค์ ๊ธธ์ด
existing["pipe_details"] = {
- "length_mm": float(mat[7]),
- "total_length_mm": pipe_length,
- "pipe_count": float(mat[4])
+ "length_mm": individual_length,
+ "total_length_mm": individual_length, # ์ฒซ ๋ฒ์งธ ํ์ดํ์ด๋ฏ๋ก ๊ฐ๋ณ ๊ธธ์ด์ ๋์ผ
+ "pipe_count": 1 # ์ฒซ ๋ฒ์งธ ํ์ดํ์ด๋ฏ๋ก 1๊ฐ
}
else:
# ๐ ์ ํญ๋ชฉ ์์ฑ
@@ -553,27 +572,22 @@ async def get_materials_by_hash(db: Session, file_id: int) -> Dict[str, Dict]:
# ํ์ดํ์ธ ๊ฒฝ์ฐ pipe_details ์ ๋ณด ์ถ๊ฐ
if mat[5] == 'PIPE' and mat[7] is not None:
- pipe_length = float(mat[4]) * float(mat[7]) # ์๋ ร ๋จ์๊ธธ์ด
+ individual_length = float(mat[7]) # ๊ฐ๋ณ ํ์ดํ์ ์ค์ ๊ธธ์ด
material_data["pipe_details"] = {
- "length_mm": float(mat[7]), # ๋จ์ ๊ธธ์ด
- "total_length_mm": pipe_length, # ์ด ๊ธธ์ด
- "pipe_count": float(mat[4]) # ํ์ดํ ๊ฐ์
+ "length_mm": individual_length, # ๊ฐ๋ณ ํ์ดํ ๊ธธ์ด
+ "total_length_mm": individual_length, # ์ฒซ ๋ฒ์งธ ํ์ดํ์ด๋ฏ๋ก ๊ฐ๋ณ ๊ธธ์ด์ ๋์ผ
+ "pipe_count": 1 # ์ฒซ ๋ฒ์งธ ํ์ดํ์ด๋ฏ๋ก 1๊ฐ
}
- print(f"๐ ํ์ดํ ์ ๊ท: {mat[1]} ({mat[2]}) - ๋จ์: {mat[7]}mm, ์ด๊ธธ์ด: {pipe_length}mm")
+ # ํ์ดํ๋ quantity๋ฅผ 1๋ก ์ค์ (pipe_count์ ๋์ผ)
+ material_data["quantity"] = 1
materials_dict[material_hash] = material_data
- # ํ์ดํ ๋ฐ์ดํฐ๊ฐ ํฌํจ๋์๋์ง ํ์ธ
+ # ํ์ดํ ๋ฐ์ดํฐ ์์ฝ๋ง ์ถ๋ ฅ
pipe_count = sum(1 for data in materials_dict.values() if data.get('category') == 'PIPE')
pipe_with_details = sum(1 for data in materials_dict.values()
if data.get('category') == 'PIPE' and 'pipe_details' in data)
- print(f"๐ ๋ฐํ ๊ฒฐ๊ณผ: ์ด {len(materials_dict)}๊ฐ ์์ฌ, ํ์ดํ {pipe_count}๊ฐ, pipe_details ์๋ ํ์ดํ {pipe_with_details}๊ฐ")
-
- # ์ฒซ ๋ฒ์งธ ํ์ดํ ๋ฐ์ดํฐ ์ํ ์ถ๋ ฅ
- for hash_key, data in materials_dict.items():
- if data.get('category') == 'PIPE':
- print(f"๐ ํ์ดํ ์ํ: {data}")
- break
+    print(f"โ
 ์์ฌ ์ฒ๋ฆฌ ์๋ฃ: ์ด {len(materials_dict)}๊ฐ, ํ์ดํ {pipe_count}๊ฐ (๊ธธ์ด์ ๋ณด: {pipe_with_details}๊ฐ)")
return materials_dict
diff --git a/backend/app/routers/purchase.py b/backend/app/routers/purchase.py
index ef6b5fc..e271e60 100644
--- a/backend/app/routers/purchase.py
+++ b/backend/app/routers/purchase.py
@@ -5,11 +5,13 @@
- ๋ฆฌ๋น์ ๋น๊ต
"""
-from fastapi import APIRouter, Depends, HTTPException, Query
+from fastapi import APIRouter, Depends, HTTPException, Query, Request
from sqlalchemy.orm import Session
from sqlalchemy import text
from typing import List, Optional
+from pydantic import BaseModel
import json
+from datetime import datetime
from ..database import get_db
from ..services.purchase_calculator import (
@@ -21,6 +23,28 @@ from ..services.purchase_calculator import (
router = APIRouter(prefix="/purchase", tags=["purchase"])
+# Pydantic ๋ชจ๋ธ (์ต์ ํ๋ ๊ตฌ์กฐ)
+class PurchaseItemMinimal(BaseModel):
+ """๊ตฌ๋งค ํ์ ์ฉ ์ต์ ํ์ ๋ฐ์ดํฐ"""
+ item_code: str
+ category: str
+ specification: str
+ size: str = ""
+ material: str = ""
+ bom_quantity: float
+ calculated_qty: float
+ unit: str = "EA"
+ safety_factor: float = 1.0
+
+class PurchaseConfirmRequest(BaseModel):
+ job_no: str
+ file_id: int
+ bom_name: Optional[str] = None # ์ ํ์ ํ๋๋ก ๋ณ๊ฒฝ
+ revision: str
+ purchase_items: List[PurchaseItemMinimal] # ์ต์ ํ๋ ๊ตฌ์กฐ ์ฌ์ฉ
+ confirmed_at: str
+ confirmed_by: str
+
@router.get("/items/calculate")
async def calculate_purchase_items(
job_no: str = Query(..., description="Job ๋ฒํธ"),
@@ -39,7 +63,7 @@ async def calculate_purchase_items(
file_query = text("""
SELECT id FROM files
WHERE job_no = :job_no AND revision = :revision AND is_active = TRUE
- ORDER BY created_at DESC
+ ORDER BY updated_at DESC
LIMIT 1
""")
file_result = db.execute(file_query, {"job_no": job_no, "revision": revision}).fetchone()
@@ -62,6 +86,139 @@ async def calculate_purchase_items(
except Exception as e:
raise HTTPException(status_code=500, detail=f"๊ตฌ๋งค ํ๋ชฉ ๊ณ์ฐ ์คํจ: {str(e)}")
+@router.post("/confirm")
+async def confirm_purchase_quantities(
+ request: PurchaseConfirmRequest,
+ db: Session = Depends(get_db)
+):
+ """
+ ๊ตฌ๋งค ์๋ ํ์
+ - ๊ณ์ฐ๋ ๊ตฌ๋งค ์๋์ ํ์ ์ํ๋ก ์ ์ฅ
+    - ์์ฌ๋ณ ํ์  ์๋ ๋ฐ ์ํ ์
๋ฐ์ดํธ
+ - ๋ฆฌ๋น์ ๋น๊ต๋ฅผ ์ํ ๊ธฐ์ค ๋ฐ์ดํฐ ์์ฑ
+ """
+ try:
+        # 1. ๊ธฐ์กด ํ์  ๋ฐ์ดํฐ ํ์ธ ๋ฐ ์
๋ฐ์ดํธ ๋๋ ์ฝ์

+ existing_query = text("""
+ SELECT id FROM purchase_confirmations
+ WHERE file_id = :file_id
+ """)
+ existing_result = db.execute(existing_query, {"file_id": request.file_id}).fetchone()
+
+ if existing_result:
+            # ๊ธฐ์กด ๋ฐ์ดํฐ ์
๋ฐ์ดํธ
+ confirmation_id = existing_result[0]
+ update_query = text("""
+ UPDATE purchase_confirmations
+ SET job_no = :job_no,
+ bom_name = :bom_name,
+ revision = :revision,
+ confirmed_at = :confirmed_at,
+ confirmed_by = :confirmed_by,
+ is_active = TRUE,
+ updated_at = CURRENT_TIMESTAMP
+ WHERE id = :confirmation_id
+ """)
+ db.execute(update_query, {
+ "confirmation_id": confirmation_id,
+ "job_no": request.job_no,
+ "bom_name": request.bom_name or f"{request.job_no}_{request.revision}", # ๊ธฐ๋ณธ๊ฐ ์ ๊ณต
+ "revision": request.revision,
+ "confirmed_at": request.confirmed_at,
+ "confirmed_by": request.confirmed_by
+ })
+
+ # ๊ธฐ์กด ํ์ ํ๋ชฉ๋ค ์ญ์
+ delete_items_query = text("""
+ DELETE FROM confirmed_purchase_items
+ WHERE confirmation_id = :confirmation_id
+ """)
+ db.execute(delete_items_query, {"confirmation_id": confirmation_id})
+ else:
+            # Insert a new confirmation record
+ confirm_query = text("""
+ INSERT INTO purchase_confirmations (
+ job_no, file_id, bom_name, revision,
+ confirmed_at, confirmed_by, is_active, created_at
+ ) VALUES (
+ :job_no, :file_id, :bom_name, :revision,
+ :confirmed_at, :confirmed_by, TRUE, CURRENT_TIMESTAMP
+ ) RETURNING id
+ """)
+
+ confirm_result = db.execute(confirm_query, {
+ "job_no": request.job_no,
+ "file_id": request.file_id,
+                "bom_name": request.bom_name or f"{request.job_no}_{request.revision}",  # fall back to a default name
+ "revision": request.revision,
+ "confirmed_at": request.confirmed_at,
+ "confirmed_by": request.confirmed_by
+ })
+
+ confirmation_id = confirm_result.fetchone()[0]
+
+        # 3. Save the confirmed purchase items
+ saved_items = 0
+ for item in request.purchase_items:
+ item_query = text("""
+ INSERT INTO confirmed_purchase_items (
+ confirmation_id, item_code, category, specification,
+ size, material, bom_quantity, calculated_qty,
+ unit, safety_factor, created_at
+ ) VALUES (
+ :confirmation_id, :item_code, :category, :specification,
+ :size, :material, :bom_quantity, :calculated_qty,
+ :unit, :safety_factor, CURRENT_TIMESTAMP
+ )
+ """)
+
+ db.execute(item_query, {
+ "confirmation_id": confirmation_id,
+ "item_code": item.item_code or f"{item.category}-{saved_items+1}",
+ "category": item.category,
+ "specification": item.specification,
+ "size": item.size or "",
+ "material": item.material or "",
+ "bom_quantity": item.bom_quantity,
+ "calculated_qty": item.calculated_qty,
+ "unit": item.unit,
+ "safety_factor": item.safety_factor
+ })
+ saved_items += 1
+
+        # 4. Update the file status to confirmed
+ file_update_query = text("""
+ UPDATE files
+ SET purchase_confirmed = TRUE,
+ confirmed_at = :confirmed_at,
+ confirmed_by = :confirmed_by,
+ updated_at = CURRENT_TIMESTAMP
+ WHERE id = :file_id
+ """)
+
+ db.execute(file_update_query, {
+ "file_id": request.file_id,
+ "confirmed_at": request.confirmed_at,
+ "confirmed_by": request.confirmed_by
+ })
+
+ db.commit()
+
+ return {
+ "success": True,
+            "message": "Purchase quantities confirmed successfully",
+ "confirmation_id": confirmation_id,
+ "confirmed_items": saved_items,
+ "job_no": request.job_no,
+ "revision": request.revision,
+ "confirmed_at": request.confirmed_at,
+ "confirmed_by": request.confirmed_by
+ }
+
+ except Exception as e:
+ db.rollback()
+        raise HTTPException(status_code=500, detail=f"Purchase quantity confirmation failed: {str(e)}")
+
@router.post("/items/save")
async def save_purchase_items(
job_no: str,
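For reference, the `/confirm` handler above expects a `PurchaseConfirmRequest` body. A minimal client-side sketch of assembling such a payload (field names are taken from the handler above; the helper name and example values are hypothetical, and the `bom_name` fallback mirrors the one the server applies):

```python
from datetime import datetime, timezone

def build_confirm_payload(job_no, file_id, revision, items, confirmed_by, bom_name=None):
    """Assemble a request body for POST /confirm (shape follows PurchaseConfirmRequest)."""
    return {
        "job_no": job_no,
        "file_id": file_id,
        "bom_name": bom_name or f"{job_no}_{revision}",  # same default the server computes
        "revision": revision,
        "purchase_items": items,
        "confirmed_at": datetime.now(timezone.utc).isoformat(),
        "confirmed_by": confirmed_by,
    }

payload = build_confirm_payload(
    "J24-001", 42, "Rev.1",
    [{"category": "PIPE", "specification": "A106 GR B", "bom_quantity": 12.0,
      "calculated_qty": 13.2, "unit": "M", "safety_factor": 1.1}],
    confirmed_by="admin",
)
print(payload["bom_name"])  # J24-001_Rev.1
```

Because the endpoint deletes and re-inserts `confirmed_purchase_items` on re-confirmation, resending the same payload is effectively idempotent for the item list.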
diff --git a/backend/app/services/activity_logger.py b/backend/app/services/activity_logger.py
new file mode 100644
index 0000000..7951841
--- /dev/null
+++ b/backend/app/services/activity_logger.py
@@ -0,0 +1,362 @@
+"""
+User activity log service
+Tracks and records all work activity
+"""
+
+from sqlalchemy.orm import Session
+from sqlalchemy import text
+from typing import Optional, Dict, Any
+from fastapi import Request
+import json
+from datetime import datetime
+
+from ..utils.logger import get_logger
+
+logger = get_logger(__name__)
+
+
+class ActivityLogger:
+    """User activity log management class"""
+
+ def __init__(self, db: Session):
+ self.db = db
+
+ def log_activity(
+ self,
+ username: str,
+ activity_type: str,
+ activity_description: str,
+ target_id: Optional[int] = None,
+ target_type: Optional[str] = None,
+ user_id: Optional[int] = None,
+ ip_address: Optional[str] = None,
+ user_agent: Optional[str] = None,
+ metadata: Optional[Dict[str, Any]] = None
+ ) -> int:
+        """
+        Record a user activity log entry
+
+        Args:
+            username: Username (required)
+            activity_type: Activity type (FILE_UPLOAD, PROJECT_CREATE, etc.)
+            activity_description: Description of the activity
+            target_id: Target ID (file, project, etc.)
+            target_type: Target type (FILE, PROJECT, etc.)
+            user_id: User ID
+            ip_address: IP address
+            user_agent: Browser information
+            metadata: Additional metadata
+
+        Returns:
+            int: ID of the created log entry
+        """
+ try:
+ insert_query = text("""
+ INSERT INTO user_activity_logs (
+ user_id, username, activity_type, activity_description,
+ target_id, target_type, ip_address, user_agent, metadata
+ ) VALUES (
+ :user_id, :username, :activity_type, :activity_description,
+ :target_id, :target_type, :ip_address, :user_agent, :metadata
+ ) RETURNING id
+ """)
+
+ result = self.db.execute(insert_query, {
+ 'user_id': user_id,
+ 'username': username,
+ 'activity_type': activity_type,
+ 'activity_description': activity_description,
+ 'target_id': target_id,
+ 'target_type': target_type,
+ 'ip_address': ip_address,
+ 'user_agent': user_agent,
+ 'metadata': json.dumps(metadata) if metadata else None
+ })
+
+ log_id = result.fetchone()[0]
+ self.db.commit()
+
+ logger.info(f"Activity logged: {username} - {activity_type} - {activity_description}")
+ return log_id
+
+ except Exception as e:
+ logger.error(f"Failed to log activity: {str(e)}")
+ self.db.rollback()
+ raise
+
+ def log_file_upload(
+ self,
+ username: str,
+ file_id: int,
+ filename: str,
+ file_size: int,
+ job_no: str,
+ revision: str,
+ user_id: Optional[int] = None,
+ ip_address: Optional[str] = None,
+ user_agent: Optional[str] = None
+ ) -> int:
+        """File upload activity log"""
+ metadata = {
+ 'filename': filename,
+ 'file_size': file_size,
+ 'job_no': job_no,
+ 'revision': revision,
+ 'upload_time': datetime.now().isoformat()
+ }
+
+ return self.log_activity(
+ username=username,
+ activity_type='FILE_UPLOAD',
+            activity_description=f'BOM file uploaded: {filename} (Job: {job_no}, Rev: {revision})',
+ target_id=file_id,
+ target_type='FILE',
+ user_id=user_id,
+ ip_address=ip_address,
+ user_agent=user_agent,
+ metadata=metadata
+ )
+
+ def log_project_create(
+ self,
+ username: str,
+ project_id: int,
+ project_name: str,
+ job_no: str,
+ user_id: Optional[int] = None,
+ ip_address: Optional[str] = None,
+ user_agent: Optional[str] = None
+ ) -> int:
+        """Project creation activity log"""
+ metadata = {
+ 'project_name': project_name,
+ 'job_no': job_no,
+ 'create_time': datetime.now().isoformat()
+ }
+
+ return self.log_activity(
+ username=username,
+ activity_type='PROJECT_CREATE',
+            activity_description=f'Project created: {project_name} ({job_no})',
+ target_id=project_id,
+ target_type='PROJECT',
+ user_id=user_id,
+ ip_address=ip_address,
+ user_agent=user_agent,
+ metadata=metadata
+ )
+
+ def log_material_classify(
+ self,
+ username: str,
+ file_id: int,
+ classified_count: int,
+ job_no: str,
+ revision: str,
+ user_id: Optional[int] = None,
+ ip_address: Optional[str] = None,
+ user_agent: Optional[str] = None
+ ) -> int:
+        """Material classification activity log"""
+ metadata = {
+ 'classified_count': classified_count,
+ 'job_no': job_no,
+ 'revision': revision,
+ 'classify_time': datetime.now().isoformat()
+ }
+
+ return self.log_activity(
+ username=username,
+ activity_type='MATERIAL_CLASSIFY',
+            activity_description=f'Material classification complete: {classified_count} materials (Job: {job_no}, Rev: {revision})',
+ target_id=file_id,
+ target_type='FILE',
+ user_id=user_id,
+ ip_address=ip_address,
+ user_agent=user_agent,
+ metadata=metadata
+ )
+
+ def log_purchase_confirm(
+ self,
+ username: str,
+ job_no: str,
+ revision: str,
+ confirmed_count: int,
+ total_amount: Optional[float] = None,
+ user_id: Optional[int] = None,
+ ip_address: Optional[str] = None,
+ user_agent: Optional[str] = None
+ ) -> int:
+        """Purchase confirmation activity log"""
+ metadata = {
+ 'job_no': job_no,
+ 'revision': revision,
+ 'confirmed_count': confirmed_count,
+ 'total_amount': total_amount,
+ 'confirm_time': datetime.now().isoformat()
+ }
+
+ return self.log_activity(
+ username=username,
+ activity_type='PURCHASE_CONFIRM',
+            activity_description=f'Purchase confirmed: {confirmed_count} items (Job: {job_no}, Rev: {revision})',
+            target_id=None,  # purchases have no single target ID
+ target_type='PURCHASE',
+ user_id=user_id,
+ ip_address=ip_address,
+ user_agent=user_agent,
+ metadata=metadata
+ )
+
+ def get_user_activities(
+ self,
+ username: str,
+ activity_type: Optional[str] = None,
+ limit: int = 50,
+ offset: int = 0
+ ) -> list:
+        """Query a user's activity history"""
+ try:
+ where_clause = "WHERE username = :username"
+ params = {'username': username}
+
+ if activity_type:
+ where_clause += " AND activity_type = :activity_type"
+ params['activity_type'] = activity_type
+
+ query = text(f"""
+ SELECT
+ id, activity_type, activity_description,
+ target_id, target_type, metadata, created_at
+ FROM user_activity_logs
+ {where_clause}
+ ORDER BY created_at DESC
+ LIMIT :limit OFFSET :offset
+ """)
+
+ params.update({'limit': limit, 'offset': offset})
+ result = self.db.execute(query, params)
+
+ activities = []
+ for row in result.fetchall():
+ activity = {
+ 'id': row[0],
+ 'activity_type': row[1],
+ 'activity_description': row[2],
+ 'target_id': row[3],
+ 'target_type': row[4],
+ 'metadata': json.loads(row[5]) if row[5] else {},
+ 'created_at': row[6].isoformat() if row[6] else None
+ }
+ activities.append(activity)
+
+ return activities
+
+ except Exception as e:
+ logger.error(f"Failed to get user activities: {str(e)}")
+ return []
+
+ def get_recent_activities(
+ self,
+ days: int = 7,
+ limit: int = 100
+ ) -> list:
+        """Query recent activity (all users)"""
+ try:
+            # Bind :days as a parameter instead of interpolating it into the SQL string
+            query = text("""
+                SELECT
+                    username, activity_type, activity_description,
+                    target_id, target_type, created_at
+                FROM user_activity_logs
+                WHERE created_at >= CURRENT_TIMESTAMP - make_interval(days => :days)
+                ORDER BY created_at DESC
+                LIMIT :limit
+            """)
+
+            result = self.db.execute(query, {'days': days, 'limit': limit})
+
+ activities = []
+ for row in result.fetchall():
+ activity = {
+ 'username': row[0],
+ 'activity_type': row[1],
+ 'activity_description': row[2],
+ 'target_id': row[3],
+ 'target_type': row[4],
+ 'created_at': row[5].isoformat() if row[5] else None
+ }
+ activities.append(activity)
+
+ return activities
+
+ except Exception as e:
+ logger.error(f"Failed to get recent activities: {str(e)}")
+ return []
+
+
+def get_client_info(request: Request) -> tuple:
+    """
+    Extract client information from the request
+
+    Args:
+        request: FastAPI Request object
+
+    Returns:
+        tuple: (ip_address, user_agent)
+    """
+    # Extract the IP address (proxy-aware); the fallback is parenthesized so the
+    # conditional expression does not swallow the header values
+    ip_address = (
+        request.headers.get('x-forwarded-for', '').split(',')[0].strip() or
+        request.headers.get('x-real-ip', '') or
+        (request.client.host if request.client else 'unknown')
+    )
+
+    # Extract the User-Agent
+ user_agent = request.headers.get('user-agent', 'unknown')
+
+ return ip_address, user_agent
+
+
+def log_activity_from_request(
+ db: Session,
+ request: Request,
+ username: str,
+ activity_type: str,
+ activity_description: str,
+ target_id: Optional[int] = None,
+ target_type: Optional[str] = None,
+ user_id: Optional[int] = None,
+ metadata: Optional[Dict[str, Any]] = None
+) -> int:
+    """
+    Record an activity log entry including request information (convenience function)
+
+    Args:
+        db: Database session
+        request: FastAPI Request object
+        username: Username
+        activity_type: Activity type
+        activity_description: Description of the activity
+        target_id: Target ID
+        target_type: Target type
+        user_id: User ID
+        metadata: Additional metadata
+
+    Returns:
+        int: ID of the created log entry
+    """
+ ip_address, user_agent = get_client_info(request)
+
+ activity_logger = ActivityLogger(db)
+ return activity_logger.log_activity(
+ username=username,
+ activity_type=activity_type,
+ activity_description=activity_description,
+ target_id=target_id,
+ target_type=target_type,
+ user_id=user_id,
+ ip_address=ip_address,
+ user_agent=user_agent,
+ metadata=metadata
+ )
diff --git a/backend/app/services/pipe_classifier.py b/backend/app/services/pipe_classifier.py
index f2661a8..dfe456e 100644
--- a/backend/app/services/pipe_classifier.py
+++ b/backend/app/services/pipe_classifier.py
@@ -29,13 +29,13 @@ PIPE_MANUFACTURING = {
# ========== PIPE end-preparation classification ==========
PIPE_END_PREP = {
"BOTH_ENDS_BEVELED": {
- "codes": ["BOE", "BOTH END", "BOTH BEVELED", "์์ชฝ๊ฐ์ "],
+ "codes": ["BBE", "BOE", "BOTH END", "BOTH BEVELED", "์์ชฝ๊ฐ์ "],
"cutting_note": "์์ชฝ ๊ฐ์ ",
"machining_required": True,
"confidence": 0.95
},
"ONE_END_BEVELED": {
- "codes": ["BE", "BEV", "PBE", "PIPE BEVELED END"],
+ "codes": ["BE", "BEV", "PBE", "PIPE BEVELED END", "POE"],
"cutting_note": "ํ์ชฝ ๊ฐ์ ",
"machining_required": True,
"confidence": 0.95
@@ -45,9 +45,85 @@ PIPE_END_PREP = {
"cutting_note": "๋ฌด ๊ฐ์ ",
"machining_required": False,
"confidence": 0.95
+ },
+ "THREADED": {
+ "codes": ["TOE", "THE", "THREADED", "๋์ฌ", "์ค๋ ๋"],
+ "cutting_note": "๋์ฌ ๊ฐ๊ณต",
+ "machining_required": True,
+ "confidence": 0.90
}
}
+# ========== Purchase pipe classification (end preparation excluded) ==========
+def get_purchase_pipe_description(description: str) -> str:
+    """Purchase pipe description - strips end-preparation information"""
+
+    # Collect every end-preparation code
+ end_prep_codes = []
+ for prep_data in PIPE_END_PREP.values():
+ end_prep_codes.extend(prep_data["codes"])
+
+    # Remove end-preparation codes from the description
+ clean_description = description.upper()
+
+    # Sort end-prep codes by length (process the longest first)
+ end_prep_codes.sort(key=len, reverse=True)
+
+ for code in end_prep_codes:
+        # Remove on word boundaries (prevents partial matches)
+ pattern = r'\b' + re.escape(code) + r'\b'
+ clean_description = re.sub(pattern, '', clean_description, flags=re.IGNORECASE)
+
+    # Additionally remove end-prep combination patterns
+    # e.g. BOE-POE, POE-TOE
+ end_prep_patterns = [
+        r'\b[A-Z]{2,3}E-[A-Z]{2,3}E\b',  # BOE-POE, POE-TOE, etc.
+        r'\b[A-Z]{2,3}E-[A-Z]{2,3}\b',   # BOE-TO, POE-TO, etc.
+        r'\b[A-Z]{2,3}-[A-Z]{2,3}E\b',   # BO-POE, PO-TOE, etc.
+        r'\b[A-Z]{2,3}-[A-Z]{2,3}\b',    # BO-PO, PO-TO, etc.
+ ]
+
+ for pattern in end_prep_patterns:
+ clean_description = re.sub(pattern, '', clean_description, flags=re.IGNORECASE)
+
+    # Tidy up leftover hyphens and whitespace
+    clean_description = re.sub(r'\s*-\s*', ' ', clean_description)  # drop hyphens
+    clean_description = re.sub(r'\s+', ' ', clean_description).strip()  # collapse runs of spaces
+
+ return clean_description
+
+def extract_end_preparation_info(description: str) -> Dict:
+    """Extract end-preparation information from a pipe description"""
+
+ desc_upper = description.upper()
+
+    # Look for an end-preparation code
+ for prep_type, prep_data in PIPE_END_PREP.items():
+ for code in prep_data["codes"]:
+ if code in desc_upper:
+ return {
+ "end_preparation_type": prep_type,
+ "end_preparation_code": code,
+ "machining_required": prep_data["machining_required"],
+ "cutting_note": prep_data["cutting_note"],
+ "confidence": prep_data["confidence"],
+ "matched_pattern": code,
+ "original_description": description,
+ "clean_description": get_purchase_pipe_description(description)
+ }
+
+    # Default: PBE (plain, no bevel)
+    return {
+        "end_preparation_type": "NO_BEVEL",  # mapped to the PBE code
+ "end_preparation_code": "PBE",
+ "machining_required": False,
+ "cutting_note": "์์ชฝ ๋ฌด๊ฐ์ (๊ธฐ๋ณธ๊ฐ)",
+ "confidence": 0.5,
+ "matched_pattern": "DEFAULT",
+ "original_description": description,
+ "clean_description": get_purchase_pipe_description(description)
+ }
+
# ========== PIPE schedule classification ==========
PIPE_SCHEDULE = {
"patterns": [
@@ -62,6 +138,23 @@ PIPE_SCHEDULE = {
]
}
+def classify_pipe_for_purchase(dat_file: str, description: str, main_nom: str,
+ length: Optional[float] = None) -> Dict:
+    """Purchase pipe classification - end-preparation information excluded"""
+
+    # Classify using a description with the end-prep information removed
+    clean_description = get_purchase_pipe_description(description)
+
+    # Run the base pipe classification
+    result = classify_pipe(dat_file, clean_description, main_nom, length)
+
+    # Mark the result as a purchase classification
+ result["purchase_classification"] = True
+ result["original_description"] = description
+ result["clean_description"] = clean_description
+
+ return result
+
def classify_pipe(dat_file: str, description: str, main_nom: str,
length: Optional[float] = None) -> Dict:
"""
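The stripping done by `get_purchase_pipe_description` boils down to word-boundary removal of the code list, longest codes first, followed by whitespace cleanup. A self-contained sketch with a reduced code list (the real function also strips combination patterns such as `BOE-POE`, which this sketch omits):

```python
import re

# Illustrative subset of the PIPE_END_PREP codes
END_PREP_CODES = ["BBE", "BOE", "PBE", "POE", "TOE", "BE"]

def strip_end_prep(description: str) -> str:
    """Remove end-preparation codes on word boundaries, longest codes first."""
    cleaned = description.upper()
    for code in sorted(END_PREP_CODES, key=len, reverse=True):
        cleaned = re.sub(r'\b' + re.escape(code) + r'\b', '', cleaned)
    cleaned = re.sub(r'\s*-\s*', ' ', cleaned)       # drop leftover hyphens
    return re.sub(r'\s+', ' ', cleaned).strip()      # collapse whitespace

print(strip_end_prep("PIPE, SMLS, SCH 80, ASTM A106 GR B PBE"))
# PIPE, SMLS, SCH 80, ASTM A106 GR B
```

Sorting longest-first matters: removing `BE` before `BBE` would leave a dangling `B` behind, and the `\b` anchors keep codes like `BE` from eating the `BE` inside unrelated words.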
diff --git a/backend/app/services/revision_comparator.py b/backend/app/services/revision_comparator.py
new file mode 100644
index 0000000..a89c010
--- /dev/null
+++ b/backend/app/services/revision_comparator.py
@@ -0,0 +1,289 @@
+"""
+Revision comparison service
+- Compares previously confirmed materials against newly uploaded ones
+- Classifies only the materials that changed
+- Optimizes revision uploads
+"""
+
+from sqlalchemy.orm import Session
+from sqlalchemy import text
+from typing import List, Dict, Tuple, Optional
+import hashlib
+import logging
+
+logger = logging.getLogger(__name__)
+
+class RevisionComparator:
+    """Revision comparison and difference analysis class"""
+
+ def __init__(self, db: Session):
+ self.db = db
+
+ def get_previous_confirmed_materials(self, job_no: str, current_revision: str) -> Optional[Dict]:
+        """
+        Fetch the list of previously confirmed materials
+
+        Args:
+            job_no: Project number
+            current_revision: Current revision (e.g. Rev.1)
+
+        Returns:
+            Dictionary with the confirmed material information, or None
+        """
+ try:
+            # Extract the current revision number
+ current_rev_num = self._extract_revision_number(current_revision)
+
+            # Search previous revisions for a confirmed one (in reverse order)
+ for prev_rev_num in range(current_rev_num - 1, -1, -1):
+ prev_revision = f"Rev.{prev_rev_num}"
+
+                # Query the confirmation data for that revision
+ query = text("""
+ SELECT pc.id, pc.revision, pc.confirmed_at, pc.confirmed_by,
+ COUNT(cpi.id) as confirmed_items_count
+ FROM purchase_confirmations pc
+ LEFT JOIN confirmed_purchase_items cpi ON pc.id = cpi.confirmation_id
+ WHERE pc.job_no = :job_no
+ AND pc.revision = :revision
+ AND pc.is_active = TRUE
+ GROUP BY pc.id, pc.revision, pc.confirmed_at, pc.confirmed_by
+ ORDER BY pc.confirmed_at DESC
+ LIMIT 1
+ """)
+
+ result = self.db.execute(query, {
+ "job_no": job_no,
+ "revision": prev_revision
+ }).fetchone()
+
+ if result and result.confirmed_items_count > 0:
+                    logger.info(f"Previous confirmation found: {job_no} {prev_revision} ({result.confirmed_items_count} items)")
+
+                    # Fetch the confirmed items in detail
+ items_query = text("""
+ SELECT cpi.item_code, cpi.category, cpi.specification,
+ cpi.size, cpi.material, cpi.bom_quantity,
+ cpi.calculated_qty, cpi.unit, cpi.safety_factor
+ FROM confirmed_purchase_items cpi
+ WHERE cpi.confirmation_id = :confirmation_id
+ ORDER BY cpi.category, cpi.specification
+ """)
+
+ items_result = self.db.execute(items_query, {
+ "confirmation_id": result.id
+ }).fetchall()
+
+ return {
+ "confirmation_id": result.id,
+ "revision": result.revision,
+ "confirmed_at": result.confirmed_at,
+ "confirmed_by": result.confirmed_by,
+                        "items": [dict(item._mapping) for item in items_result],  # Row objects need ._mapping for dict() in SQLAlchemy 1.4+
+ "items_count": len(items_result)
+ }
+
+            logger.info(f"No previous confirmation: {job_no} (current: {current_revision})")
+ return None
+
+ except Exception as e:
+            logger.error(f"Failed to query previous confirmation: {str(e)}")
+ return None
+
+ def compare_materials(self, previous_confirmed: Dict, new_materials: List[Dict]) -> Dict:
+        """
+        Compare previously confirmed materials with new materials
+
+        Args:
+            previous_confirmed: Previously confirmed material information
+            new_materials: Newly uploaded material list
+
+        Returns:
+            Comparison result dictionary
+        """
+ try:
+            # Convert previously confirmed materials into a hash map (for fast lookup)
+ confirmed_materials = {}
+ for item in previous_confirmed["items"]:
+ material_hash = self._generate_material_hash(
+ item["specification"],
+ item["size"],
+ item["material"]
+ )
+ confirmed_materials[material_hash] = item
+
+            # Analyze the new materials
+            unchanged_materials = []   # unchanged (no classification needed)
+            changed_materials = []     # changed (reclassification needed)
+            new_materials_list = []    # newly added (classification needed)
+
+ for new_material in new_materials:
+                # Generate the material hash (description-based)
+ description = new_material.get("description", "")
+ size = self._extract_size_from_description(description)
+ material = self._extract_material_from_description(description)
+
+ material_hash = self._generate_material_hash(description, size, material)
+
+ if material_hash in confirmed_materials:
+ confirmed_item = confirmed_materials[material_hash]
+
+                    # Compare quantities
+ new_qty = float(new_material.get("quantity", 0))
+ confirmed_qty = float(confirmed_item["bom_quantity"])
+
+                    if abs(new_qty - confirmed_qty) > 0.001:  # quantity changed
+ changed_materials.append({
+ **new_material,
+ "change_type": "QUANTITY_CHANGED",
+ "previous_quantity": confirmed_qty,
+ "previous_item": confirmed_item
+ })
+ else:
+                        # Same quantity - reuse the existing classification result
+ unchanged_materials.append({
+ **new_material,
+ "reuse_classification": True,
+ "previous_item": confirmed_item
+ })
+                else:
+                    # New material
+ new_materials_list.append({
+ **new_material,
+ "change_type": "NEW_MATERIAL"
+ })
+
+            # Find removed materials (present before, absent now)
+ new_material_hashes = set()
+ for material in new_materials:
+ description = material.get("description", "")
+ size = self._extract_size_from_description(description)
+ material_grade = self._extract_material_from_description(description)
+ hash_key = self._generate_material_hash(description, size, material_grade)
+ new_material_hashes.add(hash_key)
+
+ removed_materials = []
+ for hash_key, confirmed_item in confirmed_materials.items():
+ if hash_key not in new_material_hashes:
+ removed_materials.append({
+ "change_type": "REMOVED",
+ "previous_item": confirmed_item
+ })
+
+ comparison_result = {
+ "has_previous_confirmation": True,
+ "previous_revision": previous_confirmed["revision"],
+ "previous_confirmed_at": previous_confirmed["confirmed_at"],
+ "unchanged_count": len(unchanged_materials),
+ "changed_count": len(changed_materials),
+ "new_count": len(new_materials_list),
+ "removed_count": len(removed_materials),
+ "total_materials": len(new_materials),
+ "classification_needed": len(changed_materials) + len(new_materials_list),
+ "unchanged_materials": unchanged_materials,
+ "changed_materials": changed_materials,
+ "new_materials": new_materials_list,
+ "removed_materials": removed_materials
+ }
+
+            logger.info(f"Revision comparison complete: unchanged {len(unchanged_materials)}, "
+                       f"changed {len(changed_materials)}, new {len(new_materials_list)}, "
+                       f"removed {len(removed_materials)}")
+
+ return comparison_result
+
+ except Exception as e:
+            logger.error(f"Material comparison failed: {str(e)}")
+ raise
+
+ def _extract_revision_number(self, revision: str) -> int:
+        """Extract the number from a revision string (Rev.1 -> 1)"""
+ try:
+ if revision.startswith("Rev."):
+ return int(revision.replace("Rev.", ""))
+ return 0
+ except ValueError:
+ return 0
+
+ def _generate_material_hash(self, description: str, size: str, material: str) -> str:
+        """Generate a hash used to judge material identity"""
+        # Follows the coding conventions in RULES.md
+ hash_input = f"{description}|{size}|{material}".lower().strip()
+ return hashlib.md5(hash_input.encode()).hexdigest()
+
+ def _extract_size_from_description(self, description: str) -> str:
+        """Extract size information from a material description"""
+        # Simple size-pattern extraction (a more precise rule set is needed in practice)
+        import re
+ size_patterns = [
+ r'(\d+(?:\.\d+)?)\s*(?:mm|MM|์ธ์น|inch|")',
+ r'(\d+(?:\.\d+)?)\s*x\s*(\d+(?:\.\d+)?)',
+ r'DN\s*(\d+)',
+ r'(\d+)\s*A'
+ ]
+
+ for pattern in size_patterns:
+ match = re.search(pattern, description, re.IGNORECASE)
+ if match:
+ return match.group(0)
+
+ return ""
+
+ def _extract_material_from_description(self, description: str) -> str:
+        """Extract material-grade information from a material description"""
+        # Common material-grade patterns
+ materials = ["SS304", "SS316", "SS316L", "A105", "WCB", "CF8M", "CF8", "CS"]
+
+ description_upper = description.upper()
+ for material in materials:
+ if material in description_upper:
+ return material
+
+ return ""
+
+def get_revision_comparison(db: Session, job_no: str, current_revision: str,
+ new_materials: List[Dict]) -> Dict:
+    """
+    Run a revision comparison (convenience function)
+
+    Args:
+        db: Database session
+        job_no: Project number
+        current_revision: Current revision
+        new_materials: Newly uploaded material list
+
+    Returns:
+        Comparison result, or information that a full classification is needed
+    """
+ comparator = RevisionComparator(db)
+
+    # Fetch the previous confirmation
+ previous_confirmed = comparator.get_previous_confirmed_materials(job_no, current_revision)
+
+    if previous_confirmed is None:
+        # No previous confirmation: everything must be classified
+ return {
+ "has_previous_confirmation": False,
+ "classification_needed": len(new_materials),
+ "all_materials_need_classification": True,
+ "materials_to_classify": new_materials,
+            "message": "No previous confirmation found; classifying all materials."
+ }
+
+    # A previous confirmation exists: run the comparison
+ return comparator.compare_materials(previous_confirmed, new_materials)
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
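The comparison above keys materials on an MD5 of the normalized description/size/material and then buckets each new item as unchanged, changed, new, or removed. A condensed, runnable sketch of that idea (data shapes simplified; the field names here are illustrative, not the service's exact row format):

```python
import hashlib

def material_hash(description: str, size: str, material: str) -> str:
    """Same identity rule as _generate_material_hash(): normalized fields joined with '|'."""
    key = f"{description}|{size}|{material}".lower().strip()
    return hashlib.md5(key.encode()).hexdigest()

def diff_materials(confirmed: dict, new_items: list) -> dict:
    """Bucket new items the way compare_materials() does (quantity tolerance 0.001)."""
    unchanged, changed, added = [], [], []
    for item in new_items:
        h = material_hash(item["description"], item["size"], item["material"])
        prev = confirmed.get(h)
        if prev is None:
            added.append(item)                                    # not confirmed before
        elif abs(item["quantity"] - prev["quantity"]) > 0.001:
            changed.append(item)                                  # quantity drifted
        else:
            unchanged.append(item)                                # classification reusable
    new_hashes = {material_hash(i["description"], i["size"], i["material"]) for i in new_items}
    removed = [v for k, v in confirmed.items() if k not in new_hashes]
    return {"unchanged": unchanged, "changed": changed, "new": added, "removed": removed}

prev = {material_hash("PIPE SCH40", "DN50", "A106"): {"quantity": 10.0},
        material_hash("ELBOW 90", "DN50", "A234"): {"quantity": 4.0}}
result = diff_materials(prev, [
    {"description": "PIPE SCH40", "size": "DN50", "material": "A106", "quantity": 12.0},
    {"description": "FLANGE WN", "size": "DN50", "material": "A105", "quantity": 2.0},
])
print(len(result["changed"]), len(result["new"]), len(result["removed"]))  # 1 1 1
```

Only the `changed` and `new` buckets need to go back through classification, which is exactly the upload optimization the service is after.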
diff --git a/backend/debug_step_by_step.py b/backend/debug_step_by_step.py
deleted file mode 100644
index e9392db..0000000
--- a/backend/debug_step_by_step.py
+++ /dev/null
@@ -1,41 +0,0 @@
-from app.services.integrated_classifier import LEVEL1_TYPE_KEYWORDS
-
-test = "NIPPLE, SMLS, SCH 80, ASTM A106 GR B PBE"
-print(f"Test: {test}")
-
-desc_upper = test.upper()
-desc_parts = [part.strip() for part in desc_upper.split(',')]
-
-print(f"Uppercased: {desc_upper}")
-print(f"Comma-split: {desc_parts}")
-
-# Step-by-step debugging
-detected_types = []
-for material_type, keywords in LEVEL1_TYPE_KEYWORDS.items():
- type_found = False
- for keyword in keywords:
-        # Search the whole string
- if keyword in desc_upper:
-            print(f"✅ {material_type}: '{keyword}' found in the whole string")
- detected_types.append((material_type, keyword))
- type_found = True
- break
-        # Also check for an exact match in each part
- for part in desc_parts:
- if keyword == part or keyword in part:
-                print(f"✅ {material_type}: '{keyword}' found in part '{part}'")
- detected_types.append((material_type, keyword))
- type_found = True
- break
- if type_found:
- break
-
-print(f"\nDetected types: {detected_types}")
-print(f"Number of detected types: {len(detected_types)}")
-
-if len(detected_types) == 1:
-    print(f"Single type confirmed: {detected_types[0][0]}")
-elif len(detected_types) > 1:
-    print(f"Multiple types detected: {detected_types}")
-else:
-    print("No Level 1 keyword - falling back to material-grade-based classification")
\ No newline at end of file
diff --git a/backend/example_corrected_spool_usage.py b/backend/example_corrected_spool_usage.py
deleted file mode 100644
index 40e6afb..0000000
--- a/backend/example_corrected_spool_usage.py
+++ /dev/null
@@ -1,30 +0,0 @@
-"""
-Usage example for the corrected spool system
-"""
-
-# Scenario: three pipes found on drawing A-1
-examples = [
- {
- "dwg_name": "A-1",
- "pipes": [
-            {"description": "PIPE 1", "user_input_spool": "A"},  # A-1-A
-            {"description": "PIPE 2", "user_input_spool": "A"},  # A-1-A (same spool)
-            {"description": "PIPE 3", "user_input_spool": "B"}   # A-1-B (different spool)
- ],
-        "area_assignment": "#01"  # separate: drawing A-1 is located in area #01
- }
-]
-
-# Result:
-spool_identifiers = [
-    "A-1-A",  # pipes 1 and 2 belong here
-    "A-1-B"   # pipe 3 belongs here
-]
-
-area_assignment = {
-    "#01": ["A-1"]  # drawing A-1 is physically located in area #01
-}
-
-print("✅ The corrected spool structure has been applied!")
-print(f"Spool identifiers: {spool_identifiers}")
-print(f"Area assignment: {area_assignment}")
diff --git a/backend/scripts/03_insert_dummy_data.py b/backend/scripts/03_insert_dummy_data.py
deleted file mode 100644
index 48220e5..0000000
--- a/backend/scripts/03_insert_dummy_data.py
+++ /dev/null
@@ -1,108 +0,0 @@
-#!/usr/bin/env python3
-"""
-Dummy project data creation script
-"""
-
-import sys
-import os
-from datetime import datetime, date
-from sqlalchemy import create_engine, text
-
-# Add the project root to the Python path
-sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
-
-def create_dummy_jobs():
-    """Create dummy Job data"""
-
-    # Simple SQLite connection (adjust to the real DB settings)
-    try:
-        # Use the project's database.py configuration
-        from app.database import engine
-        print("✅ Database connection succeeded")
-    except ImportError:
-        # Direct connection (for development)
-        DATABASE_URL = "sqlite:///./test.db"  # change to the real DB URL
-        engine = create_engine(DATABASE_URL)
-        print("⚠️ Direct database connection")
-
-    # Dummy data definitions
- dummy_jobs = [
- {
- 'job_no': 'J24-001',
- 'job_name': '์ธ์ฐ SK์๋์ง ์ ์ ์์ค ์ฆ์ค ๋ฐฐ๊ด๊ณต์ฌ',
- 'client_name': '์ผ์ฑ์์ง๋์ด๋ง',
- 'end_user': 'SK์๋์ง',
- 'epc_company': '์ผ์ฑ์์ง๋์ด๋ง',
- 'project_site': '์ธ์ฐ๊ด์ญ์ ์จ์ฐ๊ณต๋จ SK์๋์ง ์ ์ ๊ณต์ฅ',
- 'contract_date': '2024-03-15',
- 'delivery_date': '2024-08-30',
- 'delivery_terms': 'FOB ์ธ์ฐํญ',
- 'status': '์งํ์ค',
- 'description': '์ ์ ์์ค ์ฆ์ค์ ์ํ ๋ฐฐ๊ด ์์ฌ ๊ณต๊ธ ํ๋ก์ ํธ. ๊ณ ์จ๊ณ ์ ๋ฐฐ๊ด ๋ฐ ํน์ ๋ฐธ๋ธ ํฌํจ.',
- 'created_by': 'admin'
- },
- {
- 'job_no': 'J24-002',
- 'job_name': 'ํฌ์ค์ฝ ๊ด์ ์ ์ฒ ์ ๋ฐฐ๊ด ์ ๋น๊ณต์ฌ',
- 'client_name': 'ํฌ์ค์ฝ',
- 'end_user': 'ํฌ์ค์ฝ',
- 'epc_company': None,
- 'project_site': '์ ๋จ ๊ด์์ ํฌ์ค์ฝ ๊ด์์ ์ฒ ์',
- 'contract_date': '2024-04-02',
- 'delivery_date': '2024-07-15',
- 'delivery_terms': 'DDP ๊ด์์ ์ฒ ์ ํ์ฅ',
- 'status': '์งํ์ค',
- 'description': '์ ์ฒ ์ ์ ๊ธฐ ์ ๋น๋ฅผ ์ํ ๋ฐฐ๊ด ๋ถํ ๊ต์ฒด. ๋ด์ด์ฑ ํน์๊ฐ ๋ฐฐ๊ด ํฌํจ.',
- 'created_by': 'admin'
- }
- ]
-
- try:
- with engine.connect() as conn:
-            # Delete any existing dummy data (for development)
-            print("🧹 Cleaning up existing dummy data...")
-            conn.execute(text("DELETE FROM jobs WHERE job_no IN ('J24-001', 'J24-002')"))
-
-            # Insert the new dummy data
-            print("📝 Inserting dummy data...")
-
- for job in dummy_jobs:
- insert_query = text("""
- INSERT INTO jobs (
- job_no, job_name, client_name, end_user, epc_company,
- project_site, contract_date, delivery_date, delivery_terms,
- status, description, created_by, is_active
- ) VALUES (
- :job_no, :job_name, :client_name, :end_user, :epc_company,
- :project_site, :contract_date, :delivery_date, :delivery_terms,
- :status, :description, :created_by, :is_active
- )
- """)
-
- conn.execute(insert_query, {**job, 'is_active': True})
-            print(f"✅ {job['job_no']}: {job['job_name']}")
-
-            # Commit
- conn.commit()
-
-            # Verify the result
- result = conn.execute(text("""
- SELECT job_no, job_name, client_name, status
- FROM jobs
- WHERE job_no IN ('J24-001', 'J24-002')
- """))
- jobs = result.fetchall()
-
-            print(f"\n🎉 Created {len(jobs)} dummy Jobs in total!")
-            print("\n📋 Created dummy data:")
- for job in jobs:
-                print(f"    • {job[0]}: {job[1]} ({job[2]}) - {job[3]}")
-
- return True
-
- except Exception as e:
-        print(f"❌ Dummy data creation failed: {e}")
- return False
-
-if __name__ == "__main__":
- create_dummy_jobs()
diff --git a/backend/scripts/18_create_auth_tables.sql b/backend/scripts/18_create_auth_tables.sql
index e14b541..54f5cb9 100644
--- a/backend/scripts/18_create_auth_tables.sql
+++ b/backend/scripts/18_create_auth_tables.sql
@@ -218,3 +218,19 @@ BEGIN
    RAISE NOTICE '👤 Default accounts: admin/admin123, system/admin123';
    RAISE NOTICE '🔐 Permission system: 5-level roles + granular per-module permissions';
END $$;
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/backend/scripts/19_add_user_tracking_fields.sql b/backend/scripts/19_add_user_tracking_fields.sql
new file mode 100644
index 0000000..c6df4e3
--- /dev/null
+++ b/backend/scripts/19_add_user_tracking_fields.sql
@@ -0,0 +1,142 @@
+-- Add user-tracking and owner-record fields
+-- Created: 2025.01
+-- Purpose: build the user-tracking system per the RULES guidelines
+
+-- ================================
+-- 1. Add owner fields to existing tables
+-- ================================
+
+-- Modify the files table (uploaded_by already exists)
+ALTER TABLE files
+ADD COLUMN IF NOT EXISTS updated_by VARCHAR(100),
+ADD COLUMN IF NOT EXISTS updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP;
+
+-- Modify the jobs table
+ALTER TABLE jobs
+ADD COLUMN IF NOT EXISTS created_by VARCHAR(100),
+ADD COLUMN IF NOT EXISTS updated_by VARCHAR(100),
+ADD COLUMN IF NOT EXISTS assigned_to VARCHAR(100);
+
+-- Modify the materials table
+ALTER TABLE materials
+ADD COLUMN IF NOT EXISTS classified_by VARCHAR(100),
+ADD COLUMN IF NOT EXISTS classified_at TIMESTAMP,
+ADD COLUMN IF NOT EXISTS updated_by VARCHAR(100);
+
+-- ================================
+-- 2. Create the user activity log table
+-- ================================
+
+CREATE TABLE IF NOT EXISTS user_activity_logs (
+ id SERIAL PRIMARY KEY,
+    user_id INTEGER,                  -- references the users table (no FK constraint, for flexibility)
+    username VARCHAR(100) NOT NULL,   -- username (required)
+
+    -- Activity information
+    activity_type VARCHAR(50) NOT NULL,      -- 'FILE_UPLOAD', 'PROJECT_CREATE', 'PURCHASE_CONFIRM', etc.
+    activity_description TEXT,               -- detailed activity description
+
+    -- Target information
+    target_id INTEGER,                       -- target ID (file, project, etc.)
+    target_type VARCHAR(50),                 -- 'FILE', 'PROJECT', 'MATERIAL', 'PURCHASE', etc.
+
+    -- Session information
+    ip_address VARCHAR(45),                  -- IP address
+    user_agent TEXT,                         -- browser information
+
+    -- Additional metadata (JSON)
+    metadata JSONB,                          -- extra info (file size, processing time, etc.)
+
+    -- Timestamps
+ created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
+);
+
+-- ================================
+-- 3. Modify purchase-related tables
+-- ================================
+
+-- Modify the purchase_items table (check whether created_by already exists before adding)
+ALTER TABLE purchase_items
+ADD COLUMN IF NOT EXISTS updated_by VARCHAR(100),
+ADD COLUMN IF NOT EXISTS approved_by VARCHAR(100),
+ADD COLUMN IF NOT EXISTS approved_at TIMESTAMP;
+
+-- Modify the material_purchase_tracking table (confirmed_by already exists)
+ALTER TABLE material_purchase_tracking
+ADD COLUMN IF NOT EXISTS ordered_by VARCHAR(100),
+ADD COLUMN IF NOT EXISTS ordered_at TIMESTAMP,
+ADD COLUMN IF NOT EXISTS approved_by VARCHAR(100),
+ADD COLUMN IF NOT EXISTS approved_at TIMESTAMP;
+
+-- ================================
+-- 4. Create indexes (performance optimization)
+-- ================================
+
+-- User activity log indexes
+CREATE INDEX IF NOT EXISTS idx_user_activity_logs_username ON user_activity_logs(username);
+CREATE INDEX IF NOT EXISTS idx_user_activity_logs_activity_type ON user_activity_logs(activity_type);
+CREATE INDEX IF NOT EXISTS idx_user_activity_logs_created_at ON user_activity_logs(created_at);
+CREATE INDEX IF NOT EXISTS idx_user_activity_logs_target ON user_activity_logs(target_type, target_id);
+
+-- Person-in-charge field indexes
+CREATE INDEX IF NOT EXISTS idx_files_uploaded_by ON files(uploaded_by);
+CREATE INDEX IF NOT EXISTS idx_files_updated_by ON files(updated_by);
+CREATE INDEX IF NOT EXISTS idx_jobs_created_by ON jobs(created_by);
+CREATE INDEX IF NOT EXISTS idx_jobs_assigned_to ON jobs(assigned_to);
+CREATE INDEX IF NOT EXISTS idx_materials_classified_by ON materials(classified_by);
+
+-- ================================
+-- 5. Create triggers (auto-refresh updated_at)
+-- ================================
+
+-- Auto-refresh updated_at on the files table
+CREATE OR REPLACE FUNCTION update_files_updated_at()
+RETURNS TRIGGER AS $$
+BEGIN
+ NEW.updated_at = CURRENT_TIMESTAMP;
+ RETURN NEW;
+END;
+$$ LANGUAGE plpgsql;
+
+DROP TRIGGER IF EXISTS trigger_files_updated_at ON files;
+CREATE TRIGGER trigger_files_updated_at
+ BEFORE UPDATE ON files
+ FOR EACH ROW
+ EXECUTE FUNCTION update_files_updated_at();
+
+-- Auto-refresh updated_at on the jobs table
+CREATE OR REPLACE FUNCTION update_jobs_updated_at()
+RETURNS TRIGGER AS $$
+BEGIN
+ NEW.updated_at = CURRENT_TIMESTAMP;
+ RETURN NEW;
+END;
+$$ LANGUAGE plpgsql;
+
+DROP TRIGGER IF EXISTS trigger_jobs_updated_at ON jobs;
+CREATE TRIGGER trigger_jobs_updated_at
+ BEFORE UPDATE ON jobs
+ FOR EACH ROW
+ EXECUTE FUNCTION update_jobs_updated_at();
+
+-- ================================
+-- 6. Set default data
+-- ================================
+
+-- Set a default owner on existing rows (for system migration)
+UPDATE files SET uploaded_by = 'system' WHERE uploaded_by IS NULL;
+UPDATE jobs SET created_by = 'system' WHERE created_by IS NULL;
+
+-- ================================
+-- 7. Permissions and security settings
+-- ================================
+
+-- Allow only INSERT on the activity log table (prevent updates/deletes)
+-- Separate permission management is needed in actual operations
+
+COMMENT ON TABLE user_activity_logs IS 'User activity log - tracks all work activity';
+COMMENT ON COLUMN user_activity_logs.activity_type IS 'Activity type: FILE_UPLOAD, PROJECT_CREATE, PURCHASE_CONFIRM, MATERIAL_CLASSIFY, etc.';
+COMMENT ON COLUMN user_activity_logs.metadata IS 'Extra info as JSON: file size, processing time, change details, etc.';
+
+-- Completion message
+SELECT 'User tracking system tables created successfully!' as result;
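For reviewers, a sketch of how the new `user_activity_logs` table is meant to be fed. This is illustrative only: the real table lives in PostgreSQL with a JSONB `metadata` column, while the sketch below uses the stdlib `sqlite3` (serializing metadata to a JSON string) so it runs without a database server, and the `log_activity` helper is hypothetical rather than existing backend code.

```python
import json
import sqlite3

# Stand-in schema: the real table is PostgreSQL with JSONB; sqlite3 is used
# here only so the sketch runs without a database server.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE user_activity_logs (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        username TEXT NOT NULL,
        activity_type TEXT NOT NULL,
        target_id INTEGER,
        target_type TEXT,
        metadata TEXT,  -- JSONB in PostgreSQL
        created_at TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")

def log_activity(username, activity_type, target_id=None, target_type=None, metadata=None):
    """Append-only activity logging: the log is only ever INSERTed into."""
    conn.execute(
        "INSERT INTO user_activity_logs (username, activity_type, target_id, target_type, metadata) "
        "VALUES (?, ?, ?, ?, ?)",
        (username, activity_type, target_id, target_type, json.dumps(metadata or {})),
    )
    conn.commit()

log_activity("admin", "FILE_UPLOAD", target_id=1, target_type="FILE",
             metadata={"file_size": 1024, "parsed_count": 42})
row = conn.execute(
    "SELECT username, activity_type, metadata FROM user_activity_logs"
).fetchone()
print(row)
```

The append-only discipline described in section 7 then amounts to never issuing UPDATE or DELETE against this table from application code.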
diff --git a/backend/scripts/20_add_pipe_end_preparation_table.sql b/backend/scripts/20_add_pipe_end_preparation_table.sql
new file mode 100644
index 0000000..339938f
--- /dev/null
+++ b/backend/scripts/20_add_pipe_end_preparation_table.sql
@@ -0,0 +1,49 @@
+-- Create the pipe end preparation info table
+-- Stores end preparation info separately for each pipe
+
+CREATE TABLE IF NOT EXISTS pipe_end_preparations (
+ id SERIAL PRIMARY KEY,
+ material_id INTEGER NOT NULL REFERENCES materials(id) ON DELETE CASCADE,
+ file_id INTEGER NOT NULL REFERENCES files(id) ON DELETE CASCADE,
+
+    -- End preparation info
+    end_preparation_type VARCHAR(50) DEFAULT 'PBE', -- PBE (plain both ends), BBE (bevel both ends), POE (one end prepared), PE (plain end)
+    end_preparation_code VARCHAR(20),               -- original code (BBE, POE, PBE, etc.)
+    machining_required BOOLEAN DEFAULT FALSE,       -- whether machining is required
+    cutting_note TEXT,                              -- machining note
+
+    -- Preserve original info
+    original_description TEXT NOT NULL,             -- original description including end preparation
+    clean_description TEXT NOT NULL,                -- purchasing description with end preparation removed
+
+    -- Metadata
+    confidence FLOAT DEFAULT 0.0,                   -- classification confidence
+    matched_pattern VARCHAR(100),                   -- matched pattern
+
+ created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
+ updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
+);
+
+-- Create indexes
+CREATE INDEX IF NOT EXISTS idx_pipe_end_preparations_material_id ON pipe_end_preparations(material_id);
+CREATE INDEX IF NOT EXISTS idx_pipe_end_preparations_file_id ON pipe_end_preparations(file_id);
+CREATE INDEX IF NOT EXISTS idx_pipe_end_preparations_type ON pipe_end_preparations(end_preparation_type);
+
+-- Default end preparation type definitions
+COMMENT ON COLUMN pipe_end_preparations.end_preparation_type IS 'PBE: plain both ends (default), BBE: bevel both ends, POE: one end prepared, PE: plain end';
+COMMENT ON COLUMN pipe_end_preparations.machining_required IS 'Whether machining (beveling work, etc.) is required';
+COMMENT ON COLUMN pipe_end_preparations.clean_description IS 'Description with end preparation info removed, used for purchasing';
+
+-- Trigger: auto-update updated_at
+CREATE OR REPLACE FUNCTION update_pipe_end_preparations_updated_at()
+RETURNS TRIGGER AS $$
+BEGIN
+ NEW.updated_at = CURRENT_TIMESTAMP;
+ RETURN NEW;
+END;
+$$ LANGUAGE plpgsql;
+
+DROP TRIGGER IF EXISTS update_pipe_end_preparations_updated_at ON pipe_end_preparations;
+CREATE TRIGGER update_pipe_end_preparations_updated_at
+ BEFORE UPDATE ON pipe_end_preparations
+ FOR EACH ROW
+ EXECUTE FUNCTION update_pipe_end_preparations_updated_at();
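The `original_description` / `clean_description` / `end_preparation_code` columns imply a parsing step that strips the end-preparation token from a BOM description before purchasing. Below is a minimal sketch of that step, assuming the token set is exactly PBE/BBE/POE/PE; the `split_end_preparation` helper and its regex are hypothetical, since the actual classifier is not part of this diff.

```python
import re

# Hypothetical pattern: the four end-preparation codes stored by the table above.
END_PREP_PATTERN = re.compile(r"\b(BBE|POE|PBE|PE)\b")

def split_end_preparation(description: str):
    """Return (end_preparation_code, clean_description) for a BOM line."""
    match = END_PREP_PATTERN.search(description)
    code = match.group(1) if match else None
    # clean_description: the purchasing text with the end-prep token removed
    clean = END_PREP_PATTERN.sub("", description)
    clean = re.sub(r"\s*,\s*,", ",", clean)        # collapse the dangling comma pair
    clean = re.sub(r"\s{2,}", " ", clean).strip(" ,")
    return code, clean

code, clean = split_end_preparation("PIPE SMLS, SCH 40, BBE, ASTM A106 GR B")
print(code, "|", clean)
```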
diff --git a/backend/scripts/insert_dummy_jobs.py b/backend/scripts/insert_dummy_jobs.py
deleted file mode 100644
index 72a8e4f..0000000
--- a/backend/scripts/insert_dummy_jobs.py
+++ /dev/null
@@ -1,81 +0,0 @@
-#!/usr/bin/env python3
-import sys
-import os
-from datetime import datetime, date
-
-# Add the project root to the Python path
-sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
-
-try:
- from app.database import engine
- from sqlalchemy import text
-    print("Database connection successful")
-except ImportError as e:
-    print(f"Import failed: {e}")
- sys.exit(1)
-
-def insert_dummy_data():
- dummy_jobs = [
- {
- 'job_no': 'J24-001',
-            'job_name': 'Ulsan SK Energy refinery expansion piping work',
-            'client_name': 'Samsung Engineering',
-            'end_user': 'SK Energy',
-            'epc_company': 'Samsung Engineering',
-            'project_site': 'Onsan Industrial Complex, Ulsan',
-            'contract_date': '2024-03-15',
-            'delivery_date': '2024-08-30',
-            'delivery_terms': 'FOB Ulsan Port',
-            'description': 'Piping material supply for refinery expansion',
- 'created_by': 'admin'
- },
- {
- 'job_no': 'J24-002',
-            'job_name': 'POSCO Gwangyang Steelworks piping equipment work',
-            'client_name': 'POSCO',
-            'end_user': 'POSCO',
-            'epc_company': None,
-            'project_site': 'POSCO steelworks, Gwangyang, Jeonnam',
-            'contract_date': '2024-04-02',
-            'delivery_date': '2024-07-15',
-            'delivery_terms': 'DDP Gwangyang Steelworks',
-            'description': 'Piping parts for steelworks electrical equipment',
- 'created_by': 'admin'
- }
- ]
-
- try:
- with engine.connect() as conn:
-            # Delete existing dummy data
- conn.execute(text("DELETE FROM jobs WHERE job_no IN ('J24-001', 'J24-002')"))
-
-            # Insert new data
- for job in dummy_jobs:
- query = text("""
- INSERT INTO jobs (
- job_no, job_name, client_name, end_user, epc_company,
- project_site, contract_date, delivery_date, delivery_terms,
- description, created_by, is_active
- ) VALUES (
- :job_no, :job_name, :client_name, :end_user, :epc_company,
- :project_site, :contract_date, :delivery_date, :delivery_terms,
- :description, :created_by, :is_active
- )
- """)
-
- conn.execute(query, {**job, 'is_active': True})
-                print(f"{job['job_no']}: {job['job_name']}")
-
- conn.commit()
-            print(f"\nCreated {len(dummy_jobs)} dummy jobs!")
-
-            # Verify
- result = conn.execute(text("SELECT job_no, job_name, client_name FROM jobs"))
- for row in result:
-                print(f"  - {row[0]}: {row[1]} ({row[2]})")
-
- except Exception as e:
-        print(f"Error: {e}")
-
-if __name__ == "__main__":
- insert_dummy_data()
diff --git a/backend/temp_main_update.py b/backend/temp_main_update.py
deleted file mode 100644
index aa69547..0000000
--- a/backend/temp_main_update.py
+++ /dev/null
@@ -1,5 +0,0 @@
-# Import to add to main.py
-from .api import spools
-
-# Add the app.include_router call
-app.include_router(spools.router, prefix="/api/spools", tags=["Spool Management"])
diff --git a/backend/temp_new_upload.py b/backend/temp_new_upload.py
deleted file mode 100644
index 966a99d..0000000
--- a/backend/temp_new_upload.py
+++ /dev/null
@@ -1,120 +0,0 @@
-@router.post("/upload")
-async def upload_file(
- file: UploadFile = File(...),
- job_no: str = Form(...),
- revision: str = Form("Rev.0"),
- db: Session = Depends(get_db)
-):
-    # 1. Validate the job (newly added!)
- job_validation = await validate_job_exists(job_no, db)
- if not job_validation["valid"]:
- raise HTTPException(
- status_code=400,
-            detail=f"Job error: {job_validation['error']}"
- )
-
- job_info = job_validation["job"]
-    print(f"Job validated: {job_info['job_no']} - {job_info['job_name']}")
-
-    # 2. Validate the file
- if not validate_file_extension(file.filename):
- raise HTTPException(
- status_code=400,
-            detail=f"Unsupported file format. Allowed extensions: {', '.join(ALLOWED_EXTENSIONS)}"
- )
-
- if file.size and file.size > 10 * 1024 * 1024:
-        raise HTTPException(status_code=400, detail="File size cannot exceed 10MB")
-
-    # 3. Save the file
- unique_filename = generate_unique_filename(file.filename)
- file_path = UPLOAD_DIR / unique_filename
-
- try:
- with open(file_path, "wb") as buffer:
- shutil.copyfileobj(file.file, buffer)
- except Exception as e:
-        raise HTTPException(status_code=500, detail=f"Failed to save file: {str(e)}")
-
-    # 4. Parse the file and extract materials
- try:
- materials_data = parse_file_data(str(file_path))
- parsed_count = len(materials_data)
-
-        # Save file info (uses job_no!)
- file_insert_query = text("""
- INSERT INTO files (filename, original_filename, file_path, job_no, revision, description, file_size, parsed_count, is_active)
- VALUES (:filename, :original_filename, :file_path, :job_no, :revision, :description, :file_size, :parsed_count, :is_active)
- RETURNING id
- """)
-
- file_result = db.execute(file_insert_query, {
- "filename": unique_filename,
- "original_filename": file.filename,
- "file_path": str(file_path),
-            "job_no": job_no,  # uses job_no!
- "revision": revision,
-            "description": f"BOM file - {parsed_count} materials ({job_info['job_name']})",
- "file_size": file.size,
- "parsed_count": parsed_count,
- "is_active": True
- })
-
- file_id = file_result.fetchone()[0]
-
-        # Save material data
- materials_inserted = 0
- for material_data in materials_data:
- material_insert_query = text("""
- INSERT INTO materials (
- file_id, original_description, quantity, unit, size_spec,
- material_grade, line_number, row_number, classified_category,
- classification_confidence, is_verified, created_at
- )
- VALUES (
- :file_id, :original_description, :quantity, :unit, :size_spec,
- :material_grade, :line_number, :row_number, :classified_category,
- :classification_confidence, :is_verified, :created_at
- )
- """)
-
- db.execute(material_insert_query, {
- "file_id": file_id,
- "original_description": material_data["original_description"],
- "quantity": material_data["quantity"],
- "unit": material_data["unit"],
- "size_spec": material_data["size_spec"],
- "material_grade": material_data["material_grade"],
- "line_number": material_data["line_number"],
- "row_number": material_data["row_number"],
- "classified_category": None,
- "classification_confidence": None,
- "is_verified": False,
- "created_at": datetime.now()
- })
- materials_inserted += 1
-
- db.commit()
-
- return {
- "success": True,
-            "message": f"BOM file upload complete for job '{job_info['job_name']}'!",
- "job": {
- "job_no": job_info["job_no"],
- "job_name": job_info["job_name"],
- "status": job_info["status"]
- },
- "file": {
- "id": file_id,
- "original_filename": file.filename,
- "parsed_count": parsed_count,
- "saved_count": materials_inserted
- },
- "sample_materials": materials_data[:3] if materials_data else []
- }
-
- except Exception as e:
- db.rollback()
- if os.path.exists(file_path):
- os.remove(file_path)
-        raise HTTPException(status_code=500, detail=f"File processing failed: {str(e)}")
diff --git a/backend/temp_upload_fix.py b/backend/temp_upload_fix.py
deleted file mode 100644
index 5bbf33b..0000000
--- a/backend/temp_upload_fix.py
+++ /dev/null
@@ -1,13 +0,0 @@
-# Job validation logic to add to the upload function
-
-# Add right after receiving the Form parameters:
-# Validate the job
-job_validation = await validate_job_exists(job_no, db)
-if not job_validation["valid"]:
- raise HTTPException(
- status_code=400,
-        detail=f"Job error: {job_validation['error']}"
- )
-
-job_info = job_validation["job"]
-print(f"Job validated: {job_info['job_no']} - {job_info['job_name']}")
diff --git a/backend/test_bom.csv b/backend/test_bom.csv
deleted file mode 100644
index 15b983f..0000000
--- a/backend/test_bom.csv
+++ /dev/null
@@ -1,5 +0,0 @@
-Description,Quantity,Unit,Size
-"PIPE ASTM A106 GR.B",10,EA,4"
-"ELBOW 90° ASTM A234",5,EA,4"
-"VALVE GATE ASTM A216",2,EA,4"
-"FLANGE WELD NECK",8,EA,4"
diff --git a/backend/test_main_red_nom.py b/backend/test_main_red_nom.py
deleted file mode 100644
index 126826c..0000000
--- a/backend/test_main_red_nom.py
+++ /dev/null
@@ -1,89 +0,0 @@
-#!/usr/bin/env python3
-"""
-main_nom / red_nom feature test script
-"""
-
-import sys
-import os
-sys.path.append(os.path.dirname(os.path.abspath(__file__)))
-
-from app.services.fitting_classifier import classify_fitting
-from app.services.flange_classifier import classify_flange
-
-def test_main_red_nom():
-    """Test main_nom and red_nom classification"""
-
-    print("main_nom/red_nom classification test started!")
- print("=" * 60)
-
- test_cases = [
- {
-            "name": "Regular TEE (same size)",
- "description": "TEE, SCH 40, ASTM A234 GR WPB",
- "main_nom": "4\"",
- "red_nom": None,
- "expected": "EQUAL TEE"
- },
- {
-            "name": "Reducing TEE (different sizes)",
- "description": "TEE RED, SCH 40 x SCH 40, ASTM A234 GR WPB",
- "main_nom": "4\"",
- "red_nom": "2\"",
- "expected": "REDUCING TEE"
- },
- {
-            "name": "Concentric reducer",
- "description": "RED CONC, SCH 40 x SCH 40, ASTM A234 GR WPB",
- "main_nom": "6\"",
- "red_nom": "4\"",
- "expected": "CONCENTRIC REDUCER"
- },
- {
-            "name": "Reducing flange",
- "description": "FLG REDUCING, 300LB, ASTM A105",
- "main_nom": "6\"",
- "red_nom": "4\"",
- "expected": "REDUCING FLANGE"
- }
- ]
-
- for i, test in enumerate(test_cases, 1):
- print(f"\n{i}. {test['name']}")
-        print(f"   Description: {test['description']}")
- print(f" MAIN_NOM: {test['main_nom']}")
- print(f" RED_NOM: {test['red_nom']}")
-
-        # Fitting classification test
- fitting_result = classify_fitting(
- "",
- test['description'],
- test['main_nom'],
- test['red_nom']
- )
-
-        print(f"   FITTING classification result:")
-        print(f"      Category: {fitting_result.get('category')}")
-        print(f"      Type: {fitting_result.get('fitting_type', {}).get('type')}")
-        print(f"      Subtype: {fitting_result.get('fitting_type', {}).get('subtype')}")
-        print(f"      Confidence: {fitting_result.get('overall_confidence', 0):.2f}")
-
-        # Check size info
-        size_info = fitting_result.get('size_info', {})
-        print(f"      Main size: {size_info.get('main_size')}")
-        print(f"      Reduced size: {size_info.get('reduced_size')}")
-        print(f"      Size description: {size_info.get('size_description')}")
-
-        # If RED_NOM is present, verify the REDUCING classification
-        if test['red_nom']:
-            fitting_type = fitting_result.get('fitting_type', {})
-            if 'REDUCING' in fitting_type.get('subtype', '').upper():
-                print(f"      REDUCING type correctly recognized!")
-            else:
-                print(f"      REDUCING type recognition failed")
-
- print("-" * 50)
-
-    print("\nTest complete!")
-
-if __name__ == "__main__":
- test_main_red_nom()
\ No newline at end of file
diff --git a/backend/test_mixed_bom.csv b/backend/test_mixed_bom.csv
deleted file mode 100644
index ccab0b0..0000000
--- a/backend/test_mixed_bom.csv
+++ /dev/null
@@ -1,6 +0,0 @@
-Description,Quantity,Unit,Size
-"PIPE ASTM A106 GR.B",10,EA,4"
-"GATE VALVE ASTM A216",2,EA,4"
-"FLANGE WELD NECK RF",8,EA,4"
-"90 DEG ELBOW",5,EA,4"
-"GASKET SPIRAL WOUND",4,EA,4"
diff --git a/backend/test_sample.csv b/backend/test_sample.csv
deleted file mode 100644
index 76278ff..0000000
--- a/backend/test_sample.csv
+++ /dev/null
@@ -1,6 +0,0 @@
-description,qty,main_nom,red_nom,length
-"TEE EQUAL, SCH 40, ASTM A234 GR WPB",2,4",,"
-"TEE RED, SCH 40 x SCH 40, ASTM A234 GR WPB",1,4",2","
-"RED CONC, SCH 40 x SCH 40, ASTM A234 GR WPB",1,6",4","
-"90 LR ELL, SCH 40, ASTM A234 GR WPB, SMLS",4,3",,"
-"PIPE SMLS, SCH 40, ASTM A106 GR B",1,2",,6000
\ No newline at end of file
diff --git a/database/init/20_purchase_confirmations.sql b/database/init/20_purchase_confirmations.sql
new file mode 100644
index 0000000..1084721
--- /dev/null
+++ b/database/init/20_purchase_confirmations.sql
@@ -0,0 +1,72 @@
+-- Create tables for purchase quantity confirmation
+
+-- 1. Purchase confirmation master table
+CREATE TABLE IF NOT EXISTS purchase_confirmations (
+ id SERIAL PRIMARY KEY,
+ job_no VARCHAR(50) NOT NULL,
+ file_id INTEGER REFERENCES files(id),
+ bom_name VARCHAR(255) NOT NULL,
+ revision VARCHAR(50) NOT NULL DEFAULT 'Rev.0',
+ confirmed_at TIMESTAMP NOT NULL,
+ confirmed_by VARCHAR(100) NOT NULL,
+ is_active BOOLEAN NOT NULL DEFAULT TRUE,
+ created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
+ updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
+);
+
+-- 2. Confirmed purchase items table
+CREATE TABLE IF NOT EXISTS confirmed_purchase_items (
+ id SERIAL PRIMARY KEY,
+ confirmation_id INTEGER REFERENCES purchase_confirmations(id) ON DELETE CASCADE,
+ item_code VARCHAR(100) NOT NULL,
+ category VARCHAR(50) NOT NULL,
+ specification TEXT,
+ size VARCHAR(100),
+ material VARCHAR(100),
+ bom_quantity DECIMAL(15,3) NOT NULL DEFAULT 0,
+ calculated_qty DECIMAL(15,3) NOT NULL DEFAULT 0,
+ unit VARCHAR(20) NOT NULL DEFAULT 'EA',
+ safety_factor DECIMAL(5,3) NOT NULL DEFAULT 1.0,
+ created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
+);
+
+-- 3. Add confirmation columns to the files table (ignored if they already exist)
+ALTER TABLE files
+ADD COLUMN IF NOT EXISTS purchase_confirmed BOOLEAN DEFAULT FALSE,
+ADD COLUMN IF NOT EXISTS confirmed_at TIMESTAMP,
+ADD COLUMN IF NOT EXISTS confirmed_by VARCHAR(100);
+
+-- Create indexes
+CREATE INDEX IF NOT EXISTS idx_purchase_confirmations_job_revision
+ON purchase_confirmations(job_no, revision, is_active);
+
+CREATE INDEX IF NOT EXISTS idx_confirmed_purchase_items_confirmation
+ON confirmed_purchase_items(confirmation_id);
+
+CREATE INDEX IF NOT EXISTS idx_confirmed_purchase_items_category
+ON confirmed_purchase_items(category);
+
+CREATE INDEX IF NOT EXISTS idx_files_purchase_confirmed
+ON files(purchase_confirmed);
+
+-- Add comments
+COMMENT ON TABLE purchase_confirmations IS 'Purchase quantity confirmation master table';
+COMMENT ON TABLE confirmed_purchase_items IS 'Confirmed purchase item detail table';
+COMMENT ON COLUMN files.purchase_confirmed IS 'Whether the purchase quantity is confirmed';
+COMMENT ON COLUMN files.confirmed_at IS 'When the purchase quantity was confirmed';
+COMMENT ON COLUMN files.confirmed_by IS 'Who confirmed the purchase quantity';
+
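The `confirmed_purchase_items` table stores `bom_quantity`, `safety_factor`, and `calculated_qty` side by side. One plausible reading of their relationship (an assumption; the business rule is not stated in this diff) is the BOM quantity scaled by the safety factor and rounded up to the table's `DECIMAL(15,3)` precision:

```python
from decimal import Decimal, ROUND_CEILING

# Assumed rule: calculated_qty = bom_quantity * safety_factor, rounded up
# to 3 decimal places to match the DECIMAL(15,3) columns above.
def calculated_qty(bom_quantity: Decimal, safety_factor: Decimal) -> Decimal:
    raw = bom_quantity * safety_factor
    return raw.quantize(Decimal("0.001"), rounding=ROUND_CEILING)

qty = calculated_qty(Decimal("12.000"), Decimal("1.100"))
print(qty)  # → 13.200
```

Using `Decimal` rather than `float` keeps the application arithmetic consistent with the exact decimal types in the schema.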
diff --git a/docker-backup/docker-compose.override.yml b/docker-backup/docker-compose.override.yml
new file mode 100644
index 0000000..2915979
--- /dev/null
+++ b/docker-backup/docker-compose.override.yml
@@ -0,0 +1,33 @@
+# Development environment override (default)
+# Applied automatically on `docker-compose up`
+# version: '3.8'  # the version field is optional in Docker Compose v2
+
+services:
+ backend:
+ volumes:
+      # Live-reload code changes during development
+ - ./backend:/app
+ environment:
+ - DEBUG=true
+ - RELOAD=true
+ - LOG_LEVEL=DEBUG
+
+ frontend:
+ environment:
+ - VITE_API_URL=http://localhost:18000
+ build:
+ args:
+ - VITE_API_URL=http://localhost:18000
+
+  # In the dev environment, expose all ports externally
+ postgres:
+ ports:
+ - "5432:5432"
+
+ redis:
+ ports:
+ - "6379:6379"
+
+ pgadmin:
+ ports:
+ - "5050:80"
diff --git a/docker-backup/docker-compose.prod.yml b/docker-backup/docker-compose.prod.yml
new file mode 100644
index 0000000..ff1d820
--- /dev/null
+++ b/docker-backup/docker-compose.prod.yml
@@ -0,0 +1,55 @@
+# Production environment override
+version: '3.8'
+
+services:
+  backend:
+    environment:
+      - ENVIRONMENT=production
+      - DEBUG=false
+      - RELOAD=false
+      - LOG_LEVEL=INFO
+    # In production, drop the source-code volume mount
+    volumes:
+      - ./backend/uploads:/app/uploads
+    # Do not expose the port externally (served via the reverse proxy)
+    ports: []
+
+  frontend:
+    environment:
+      - VITE_API_URL=/api
+    build:
+      args:
+        - VITE_API_URL=/api
+    ports: []
+
+  # Production reverse proxy
+  nginx:
+    image: nginx:alpine
+    container_name: tk-mp-nginx
+    restart: unless-stopped
+    ports:
+      - "80:80"
+      - "443:443"
+    volumes:
+      - ./nginx/nginx.conf:/etc/nginx/nginx.conf:ro
+      - ./nginx/ssl:/etc/nginx/ssl:ro
+    depends_on:
+      - frontend
+      - backend
+    networks:
+      - tk-mp-network
+
+  # Block external port access in production
+  postgres:
+    ports: []
+
+  redis:
+    ports: []
+
+  # pgAdmin is disabled in production
+ pgadmin:
+ profiles:
+ - disabled
\ No newline at end of file
diff --git a/docker-backup/docker-compose.synology.yml b/docker-backup/docker-compose.synology.yml
new file mode 100644
index 0000000..9be78f3
--- /dev/null
+++ b/docker-backup/docker-compose.synology.yml
@@ -0,0 +1,57 @@
+# Synology NAS environment override
+version: '3.8'
+
+services:
+ postgres:
+ container_name: tk-mp-postgres-synology
+ ports:
+ - "15432:5432"
+ volumes:
+ - tk_mp_postgres_data:/var/lib/postgresql/data
+ - ./database/init:/docker-entrypoint-initdb.d
+
+ redis:
+ container_name: tk-mp-redis-synology
+ ports:
+ - "16379:6379"
+ volumes:
+ - tk_mp_redis_data:/data
+
+ backend:
+ container_name: tk-mp-backend-synology
+ ports:
+ - "10080:8000"
+ environment:
+ - ENVIRONMENT=synology
+ - DEBUG=false
+ - RELOAD=false
+ - LOG_LEVEL=INFO
+ - DATABASE_URL=postgresql://${POSTGRES_USER:-tkmp_user}:${POSTGRES_PASSWORD:-tkmp_password_2025}@postgres:5432/${POSTGRES_DB:-tk_mp_bom}
+ - REDIS_URL=redis://redis:6379
+ volumes:
+ - tk_mp_uploads:/app/uploads
+
+ frontend:
+ container_name: tk-mp-frontend-synology
+ ports:
+ - "10173:3000"
+ environment:
+ - VITE_API_URL=http://localhost:10080
+ build:
+ args:
+ - VITE_API_URL=http://localhost:10080
+
+  # Change the pgAdmin port on Synology
+ pgadmin:
+ container_name: tk-mp-pgadmin-synology
+ ports:
+ - "15050:80"
+
+# Named volumes for Synology
+volumes:
+ tk_mp_postgres_data:
+ external: false
+ tk_mp_redis_data:
+ external: false
+ tk_mp_uploads:
+ external: false
\ No newline at end of file
diff --git a/docker-backup/docker-compose.yml b/docker-backup/docker-compose.yml
new file mode 100644
index 0000000..64da84a
--- /dev/null
+++ b/docker-backup/docker-compose.yml
@@ -0,0 +1,124 @@
+# TK-MP-Project Docker Compose configuration
+# Base configuration (development environment defaults)
+# version: '3.8'  # the version field is optional in Docker Compose v2
+
+services:
+  # PostgreSQL database
+ postgres:
+ image: postgres:15-alpine
+ container_name: tk-mp-postgres
+ restart: unless-stopped
+ environment:
+ POSTGRES_DB: ${POSTGRES_DB:-tk_mp_bom}
+ POSTGRES_USER: ${POSTGRES_USER:-tkmp_user}
+ POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-tkmp_password_2025}
+ POSTGRES_INITDB_ARGS: "--encoding=UTF-8 --locale=C"
+ ports:
+ - "${POSTGRES_PORT:-5432}:5432"
+ volumes:
+ - postgres_data:/var/lib/postgresql/data
+ - ./database/init:/docker-entrypoint-initdb.d
+ networks:
+ - tk-mp-network
+ healthcheck:
+ test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER:-tkmp_user} -d ${POSTGRES_DB:-tk_mp_bom}"]
+ interval: 30s
+ timeout: 10s
+ retries: 3
+
+  # Redis (cache and session management)
+ redis:
+ image: redis:7-alpine
+ container_name: tk-mp-redis
+ restart: unless-stopped
+ ports:
+ - "${REDIS_PORT:-6379}:6379"
+ volumes:
+ - redis_data:/data
+ networks:
+ - tk-mp-network
+ healthcheck:
+ test: ["CMD", "redis-cli", "ping"]
+ interval: 30s
+ timeout: 10s
+ retries: 3
+
+  # Backend FastAPI service
+ backend:
+ build:
+ context: ./backend
+ dockerfile: Dockerfile
+ container_name: tk-mp-backend
+ restart: unless-stopped
+ ports:
+ - "${BACKEND_PORT:-18000}:8000"
+ environment:
+ - DATABASE_URL=postgresql://${POSTGRES_USER:-tkmp_user}:${POSTGRES_PASSWORD:-tkmp_password_2025}@postgres:5432/${POSTGRES_DB:-tk_mp_bom}
+ - REDIS_URL=redis://redis:6379
+ - ENVIRONMENT=${ENVIRONMENT:-development}
+ - DEBUG=${DEBUG:-true}
+ - PYTHONPATH=/app
+ depends_on:
+ - postgres
+ - redis
+ networks:
+ - tk-mp-network
+ volumes:
+ - ./backend/uploads:/app/uploads
+      # In dev, code changes hot-reload (configured in the override file)
+ # healthcheck:
+ # test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
+ # interval: 30s
+ # timeout: 10s
+ # retries: 3
+
+  # Frontend React + Nginx service
+ frontend:
+ build:
+ context: ./frontend
+ dockerfile: Dockerfile
+ args:
+ - VITE_API_URL=${VITE_API_URL:-http://localhost:18000}
+ container_name: tk-mp-frontend
+ restart: unless-stopped
+ ports:
+ - "${FRONTEND_PORT:-13000}:3000"
+ environment:
+ - VITE_API_URL=${VITE_API_URL:-http://localhost:18000}
+ depends_on:
+ - backend
+ networks:
+ - tk-mp-network
+
+  # pgAdmin web admin tool (dev/test environments)
+ pgadmin:
+ image: dpage/pgadmin4:latest
+ container_name: tk-mp-pgadmin
+ restart: unless-stopped
+ environment:
+ PGADMIN_DEFAULT_EMAIL: ${PGADMIN_EMAIL:-admin@example.com}
+ PGADMIN_DEFAULT_PASSWORD: ${PGADMIN_PASSWORD:-admin2025}
+ PGADMIN_CONFIG_SERVER_MODE: 'False'
+ ports:
+ - "${PGADMIN_PORT:-5050}:80"
+ volumes:
+ - pgadmin_data:/var/lib/pgadmin
+ depends_on:
+ - postgres
+ networks:
+ - tk-mp-network
+ profiles:
+ - dev
+ - test
+
+volumes:
+ postgres_data:
+ driver: local
+ pgadmin_data:
+ driver: local
+ redis_data:
+ driver: local
+
+networks:
+ tk-mp-network:
+ driver: bridge
\ No newline at end of file
diff --git a/docker-compose.dev.yml b/docker-compose.dev.yml
deleted file mode 100644
index d641d91..0000000
--- a/docker-compose.dev.yml
+++ /dev/null
@@ -1,26 +0,0 @@
-version: '3.8'
-
-# Development environment override
-services:
- frontend:
- environment:
- - VITE_API_URL=http://localhost:18000
- build:
- args:
- - VITE_API_URL=http://localhost:18000
-
- backend:
- volumes:
-      - ./backend:/app  # live-reload code changes during development
- environment:
- - DEBUG=True
- - RELOAD=True
-
-  # Development port mappings
- postgres:
- ports:
- - "5432:5432"
-
- redis:
- ports:
- - "6379:6379"
\ No newline at end of file
diff --git a/docker-compose.override.yml b/docker-compose.override.yml
new file mode 100644
index 0000000..af35268
--- /dev/null
+++ b/docker-compose.override.yml
@@ -0,0 +1,34 @@
+# Development environment override (default)
+# Applied automatically on `docker-compose up`
+# version: '3.8'  # the version field is optional in Docker Compose v2
+
+services:
+ backend:
+ volumes:
+      # Live-reload code changes during development
+ - ./backend:/app
+ environment:
+ - DEBUG=true
+ - RELOAD=true
+ - LOG_LEVEL=DEBUG
+
+ frontend:
+ environment:
+ - VITE_API_URL=http://localhost:18000
+ build:
+ args:
+ - VITE_API_URL=http://localhost:18000
+
+  # In the dev environment, expose all ports externally
+ postgres:
+ ports:
+ - "5432:5432"
+
+ redis:
+ ports:
+ - "6379:6379"
+
+ pgadmin:
+ ports:
+ - "5050:80"
+
diff --git a/docker-compose.prod.yml b/docker-compose.prod.yml
deleted file mode 100644
index 60b3557..0000000
--- a/docker-compose.prod.yml
+++ /dev/null
@@ -1,46 +0,0 @@
-version: '3.8'
-
-# Production environment override
-services:
- frontend:
- environment:
- - VITE_API_URL=/api
- build:
- args:
- - VITE_API_URL=/api
-    # Do not expose the port externally (reverse proxy in use)
- ports: []
-
- backend:
- environment:
- - DEBUG=False
- - RELOAD=False
-    # Do not expose the port externally
- ports: []
-
-  # Production reverse proxy (e.g. Nginx)
- nginx:
- image: nginx:alpine
- container_name: tk-mp-nginx
- restart: unless-stopped
- ports:
- - "80:80"
- - "443:443"
- volumes:
- - ./nginx/nginx.conf:/etc/nginx/nginx.conf
-      - ./nginx/ssl:/etc/nginx/ssl  # SSL certificates
- depends_on:
- - frontend
- - backend
- networks:
- - tk-mp-network
-
-  # Restrict database access
-  postgres:
-    ports: []  # block external access
-
-  redis:
-    ports: []  # block external access
-
-  pgadmin:
-    ports: []  # block external access (use SSH tunneling if needed)
\ No newline at end of file
diff --git a/docker-compose.synology.yml b/docker-compose.synology.yml
deleted file mode 100644
index d4b6921..0000000
--- a/docker-compose.synology.yml
+++ /dev/null
@@ -1,76 +0,0 @@
-version: '3.8'
-
-services:
-  # PostgreSQL database
- tk-mp-postgres:
- image: postgres:15-alpine
- container_name: tk-mp-postgres
- restart: unless-stopped
- environment:
- POSTGRES_DB: tk_mp_bom
- POSTGRES_USER: tkmp_user
- POSTGRES_PASSWORD: tkmp_password_2025
- POSTGRES_INITDB_ARGS: "--encoding=UTF-8 --locale=C"
- ports:
- - "15432:5432"
- volumes:
- - tk_mp_postgres_data:/var/lib/postgresql/data
- - ./database/init:/docker-entrypoint-initdb.d
- networks:
- - tk-mp-network
-
-  # Redis (cache and session management)
- tk-mp-redis:
- image: redis:7-alpine
- container_name: tk-mp-redis
- restart: unless-stopped
- ports:
- - "16379:6379"
- volumes:
- - tk_mp_redis_data:/data
- networks:
- - tk-mp-network
-
-  # Backend FastAPI service
- tk-mp-backend:
- build:
- context: ./backend
- dockerfile: Dockerfile
- container_name: tk-mp-backend
- restart: unless-stopped
- ports:
- - "10080:10080"
- environment:
- - DATABASE_URL=postgresql://tkmp_user:tkmp_password_2025@tk-mp-postgres:5432/tk_mp_bom
- - REDIS_URL=redis://tk-mp-redis:6379
- - PYTHONPATH=/app
- depends_on:
- - tk-mp-postgres
- - tk-mp-redis
- networks:
- - tk-mp-network
- volumes:
- - tk_mp_uploads:/app/uploads
-
-  # Frontend Nginx service
- tk-mp-frontend:
- build:
- context: ./frontend
- dockerfile: Dockerfile
- container_name: tk-mp-frontend
- restart: unless-stopped
- ports:
- - "10173:10173"
- depends_on:
- - tk-mp-backend
- networks:
- - tk-mp-network
-
-volumes:
- tk_mp_postgres_data:
- tk_mp_redis_data:
- tk_mp_uploads:
-
-networks:
- tk-mp-network:
- driver: bridge
\ No newline at end of file
diff --git a/docker-compose.unified.yml b/docker-compose.unified.yml
new file mode 100644
index 0000000..c9004c9
--- /dev/null
+++ b/docker-compose.unified.yml
@@ -0,0 +1,160 @@
+# TK-MP-Project unified Docker Compose configuration
+# The DEPLOY_ENV variable selects the environment: development (default), production, synology
+
+services:
+  # PostgreSQL database
+ postgres:
+ image: postgres:15-alpine
+ container_name: ${COMPOSE_PROJECT_NAME:-tk-mp}-postgres${CONTAINER_SUFFIX:-}
+ restart: unless-stopped
+ environment:
+ POSTGRES_DB: ${POSTGRES_DB:-tk_mp_bom}
+ POSTGRES_USER: ${POSTGRES_USER:-tkmp_user}
+ POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-tkmp_password_2025}
+ POSTGRES_INITDB_ARGS: "--encoding=UTF-8 --locale=C"
+ ports:
+      # dev: 5432, production: none, synology: 15432
+ - "${POSTGRES_EXTERNAL_PORT:-5432}:5432"
+ volumes:
+ - ${POSTGRES_DATA_VOLUME:-postgres_data}:/var/lib/postgresql/data
+ - ./database/init:/docker-entrypoint-initdb.d
+ networks:
+ - tk-mp-network
+ healthcheck:
+ test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER:-tkmp_user} -d ${POSTGRES_DB:-tk_mp_bom}"]
+ interval: 30s
+ timeout: 10s
+ retries: 3
+ profiles:
+ - ${POSTGRES_PROFILE:-default}
+
+  # Redis (cache and session management)
+ redis:
+ image: redis:7-alpine
+ container_name: ${COMPOSE_PROJECT_NAME:-tk-mp}-redis${CONTAINER_SUFFIX:-}
+ restart: unless-stopped
+ ports:
+      # dev: 6379, production: none, synology: 16379
+ - "${REDIS_EXTERNAL_PORT:-6379}:6379"
+ volumes:
+ - ${REDIS_DATA_VOLUME:-redis_data}:/data
+ networks:
+ - tk-mp-network
+ healthcheck:
+ test: ["CMD", "redis-cli", "ping"]
+ interval: 30s
+ timeout: 10s
+ retries: 3
+ profiles:
+ - ${REDIS_PROFILE:-default}
+
+  # Backend FastAPI service
+ backend:
+ build:
+ context: ./backend
+ dockerfile: Dockerfile
+ container_name: ${COMPOSE_PROJECT_NAME:-tk-mp}-backend${CONTAINER_SUFFIX:-}
+ restart: unless-stopped
+ ports:
+      # dev: 18000, production: none, synology: 10080
+ - "${BACKEND_EXTERNAL_PORT:-18000}:8000"
+ environment:
+ - DATABASE_URL=postgresql://${POSTGRES_USER:-tkmp_user}:${POSTGRES_PASSWORD:-tkmp_password_2025}@postgres:5432/${POSTGRES_DB:-tk_mp_bom}
+ - REDIS_URL=redis://redis:6379
+ - ENVIRONMENT=${DEPLOY_ENV:-development}
+ - DEBUG=${DEBUG:-true}
+ - RELOAD=${RELOAD:-true}
+ - LOG_LEVEL=${LOG_LEVEL:-DEBUG}
+ - PYTHONPATH=/app
+ depends_on:
+ - postgres
+ - redis
+ networks:
+ - tk-mp-network
+ volumes:
+      # dev: code mount; production/synology: uploads only
+ - ${BACKEND_CODE_VOLUME:-./backend}:/app
+ - ${UPLOADS_VOLUME:-./backend/uploads}:/app/uploads
+ profiles:
+ - ${BACKEND_PROFILE:-default}
+
+ # Frontend React + Nginx service
+ frontend:
+ build:
+ context: ./frontend
+ dockerfile: Dockerfile
+ args:
+ - VITE_API_URL=${VITE_API_URL:-http://localhost:18000}
+ container_name: ${COMPOSE_PROJECT_NAME:-tk-mp}-frontend${CONTAINER_SUFFIX:-}
+ restart: unless-stopped
+ ports:
+ # dev: 13000, prod: not exposed, synology: 10173
+ - "${FRONTEND_EXTERNAL_PORT:-13000}:3000"
+ environment:
+ - VITE_API_URL=${VITE_API_URL:-http://localhost:18000}
+ depends_on:
+ - backend
+ networks:
+ - tk-mp-network
+ profiles:
+ - ${FRONTEND_PROFILE:-default}
+
+ # Nginx reverse proxy (production only)
+ nginx:
+ image: nginx:alpine
+ container_name: ${COMPOSE_PROJECT_NAME:-tk-mp}-nginx${CONTAINER_SUFFIX:-}
+ restart: unless-stopped
+ ports:
+ - "${NGINX_HTTP_PORT:-80}:80"
+ - "${NGINX_HTTPS_PORT:-443}:443"
+ volumes:
+ - ./nginx/nginx.conf:/etc/nginx/nginx.conf:ro
+ - ./nginx/ssl:/etc/nginx/ssl:ro
+ depends_on:
+ - frontend
+ - backend
+ networks:
+ - tk-mp-network
+ profiles:
+ - production
+
+ # pgAdmin web admin tool
+ pgadmin:
+ image: dpage/pgadmin4:latest
+ container_name: ${COMPOSE_PROJECT_NAME:-tk-mp}-pgadmin${CONTAINER_SUFFIX:-}
+ restart: unless-stopped
+ environment:
+ PGADMIN_DEFAULT_EMAIL: ${PGADMIN_EMAIL:-admin@example.com}
+ PGADMIN_DEFAULT_PASSWORD: ${PGADMIN_PASSWORD:-admin2025}
+ PGADMIN_CONFIG_SERVER_MODE: 'False'
+ ports:
+ # dev: 5050, synology: 15050, prod: disabled
+ - "${PGADMIN_EXTERNAL_PORT:-5050}:80"
+ volumes:
+ - ${PGADMIN_DATA_VOLUME:-pgadmin_data}:/var/lib/pgadmin
+ depends_on:
+ - postgres
+ networks:
+ - tk-mp-network
+ profiles:
+ - ${PGADMIN_PROFILE:-dev}
+
+volumes:
+ postgres_data:
+ driver: local
+ pgadmin_data:
+ driver: local
+ redis_data:
+ driver: local
+ # Named volumes for Synology
+ tk_mp_postgres_data:
+ external: false
+ tk_mp_redis_data:
+ external: false
+ tk_mp_uploads:
+ external: false
+
+networks:
+ tk-mp-network:
+ driver: bridge
+
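Nearly every setting in the compose file above goes through `${VAR:-default}` interpolation. A minimal shell sketch of those semantics (Docker Compose interpolation follows the same unset-or-empty rule):

```shell
#!/bin/sh
# ${VAR:-default}: the default applies when VAR is unset OR empty.
unset POSTGRES_EXTERNAL_PORT
p1="${POSTGRES_EXTERNAL_PORT:-5432}"   # unset -> 5432
POSTGRES_EXTERNAL_PORT=""
p2="${POSTGRES_EXTERNAL_PORT:-5432}"   # empty -> 5432 as well
POSTGRES_EXTERNAL_PORT="15432"
p3="${POSTGRES_EXTERNAL_PORT:-5432}"   # set -> 15432 wins
echo "$p1 $p2 $p3"
```

So a variable left empty in an env file behaves like an unset one wherever the `:-` form is used.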
diff --git a/docker-compose.yml b/docker-compose.yml
index 840c6e6..8cd88f5 100644
--- a/docker-compose.yml
+++ b/docker-compose.yml
@@ -1,4 +1,6 @@
-version: '3.8'
+# TK-MP-Project Docker Compose configuration
+# Base settings (development defaults)
+# version: '3.8'  # the version field is optional in Docker Compose v2
services:
# PostgreSQL database
@@ -7,35 +9,22 @@ services:
container_name: tk-mp-postgres
restart: unless-stopped
environment:
- POSTGRES_DB: tk_mp_bom
- POSTGRES_USER: tkmp_user
- POSTGRES_PASSWORD: tkmp_password_2025
+ POSTGRES_DB: ${POSTGRES_DB:-tk_mp_bom}
+ POSTGRES_USER: ${POSTGRES_USER:-tkmp_user}
+ POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-tkmp_password_2025}
POSTGRES_INITDB_ARGS: "--encoding=UTF-8 --locale=C"
ports:
- - "5432:5432"
+ - "${POSTGRES_PORT:-5432}:5432"
volumes:
- postgres_data:/var/lib/postgresql/data
- ./database/init:/docker-entrypoint-initdb.d
networks:
- tk-mp-network
-
- # pgAdmin web admin tool
- pgadmin:
- image: dpage/pgadmin4:latest
- container_name: tk-mp-pgadmin
- restart: unless-stopped
- environment:
- PGADMIN_DEFAULT_EMAIL: admin@example.com
- PGADMIN_DEFAULT_PASSWORD: admin2025
- PGADMIN_CONFIG_SERVER_MODE: 'False'
- ports:
- - "5050:80"
- volumes:
- - pgadmin_data:/var/lib/pgadmin
- depends_on:
- - postgres
- networks:
- - tk-mp-network
+ healthcheck:
+ test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER:-tkmp_user} -d ${POSTGRES_DB:-tk_mp_bom}"]
+ interval: 30s
+ timeout: 10s
+ retries: 3
# Redis (cache and session store)
redis:
@@ -43,11 +32,16 @@ services:
container_name: tk-mp-redis
restart: unless-stopped
ports:
- - "6379:6379"
+ - "${REDIS_PORT:-6379}:6379"
volumes:
- redis_data:/data
networks:
- tk-mp-network
+ healthcheck:
+ test: ["CMD", "redis-cli", "ping"]
+ interval: 30s
+ timeout: 10s
+ retries: 3
# Backend FastAPI service
backend:
@@ -57,10 +51,13 @@ services:
container_name: tk-mp-backend
restart: unless-stopped
ports:
- - "18000:8000"
+ - "${BACKEND_PORT:-18000}:8000"
environment:
- - DATABASE_URL=postgresql://tkmp_user:tkmp_password_2025@postgres:5432/tk_mp_bom
+ - DATABASE_URL=postgresql://${POSTGRES_USER:-tkmp_user}:${POSTGRES_PASSWORD:-tkmp_password_2025}@postgres:5432/${POSTGRES_DB:-tk_mp_bom}
- REDIS_URL=redis://redis:6379
+ - ENVIRONMENT=${ENVIRONMENT:-development}
+ - DEBUG=${DEBUG:-true}
+ - PYTHONPATH=/app
depends_on:
- postgres
- redis
@@ -68,25 +65,52 @@ services:
- tk-mp-network
volumes:
- ./backend/uploads:/app/uploads
+ # In development, code changes reload live (configured via the override file)
+ # healthcheck:
+ # test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
+ # interval: 30s
+ # timeout: 10s
+ # retries: 3
- # Frontend Nginx service
+ # Frontend React + Nginx service
frontend:
build:
context: ./frontend
dockerfile: Dockerfile
args:
- - VITE_API_URL=${VITE_API_URL:-/api}
+ - VITE_API_URL=${VITE_API_URL:-http://localhost:18000}
container_name: tk-mp-frontend
restart: unless-stopped
ports:
- - "13000:3000"
+ - "${FRONTEND_PORT:-13000}:3000"
environment:
- - VITE_API_URL=${VITE_API_URL:-/api}
+ - VITE_API_URL=${VITE_API_URL:-http://localhost:18000}
depends_on:
- backend
networks:
- tk-mp-network
+ # pgAdmin web admin tool (for dev/test environments)
+ pgadmin:
+ image: dpage/pgadmin4:latest
+ container_name: tk-mp-pgadmin
+ restart: unless-stopped
+ environment:
+ PGADMIN_DEFAULT_EMAIL: ${PGADMIN_EMAIL:-admin@example.com}
+ PGADMIN_DEFAULT_PASSWORD: ${PGADMIN_PASSWORD:-admin2025}
+ PGADMIN_CONFIG_SERVER_MODE: 'False'
+ ports:
+ - "${PGADMIN_PORT:-5050}:80"
+ volumes:
+ - pgadmin_data:/var/lib/pgadmin
+ depends_on:
+ - postgres
+ networks:
+ - tk-mp-network
+ profiles:
+ - dev
+ - test
+
volumes:
postgres_data:
driver: local
@@ -97,4 +121,4 @@ volumes:
networks:
tk-mp-network:
- driver: bridge
+ driver: bridge
\ No newline at end of file
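The backend's `DATABASE_URL` above is assembled from three variables. A shell sketch reproducing that assembly with the same defaults (the host name `postgres` only resolves on the compose network, so this is for inspection, not for connecting from the host):

```shell
#!/bin/sh
# Assemble the connection URL the same way the compose file does.
POSTGRES_USER="${POSTGRES_USER:-tkmp_user}"
POSTGRES_PASSWORD="${POSTGRES_PASSWORD:-tkmp_password_2025}"
POSTGRES_DB="${POSTGRES_DB:-tk_mp_bom}"
DATABASE_URL="postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}"
echo "$DATABASE_URL"
```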
diff --git a/docker-run.sh b/docker-run.sh
new file mode 100755
index 0000000..0e45534
--- /dev/null
+++ b/docker-run.sh
@@ -0,0 +1,104 @@
+#!/bin/bash
+
+# TK-MP-Project Docker run script
+# Usage: ./docker-run.sh [environment] [command]
+# Environments: dev (default), prod, synology
+# Commands: up, down, build, rebuild, logs, ps, restart
+
+set -e
+
+# Defaults
+ENV=${1:-dev}
+CMD=${2:-up}
+
+# Per-environment config files
+case $ENV in
+ "dev"|"development")
+ ENV_FILE="env.development"
+ COMPOSE_FILE="docker-compose.yml"
+ echo "๐ง ๊ฐ๋ฐ ํ๊ฒฝ์ผ๋ก ์คํํฉ๋๋ค..."
+ ;;
+ "prod"|"production")
+ ENV_FILE="env.production"
+ COMPOSE_FILE="docker-compose.yml"
+ echo "๐ ํ๋ก๋์
ํ๊ฒฝ์ผ๋ก ์คํํฉ๋๋ค..."
+ ;;
+ "synology"|"nas")
+ ENV_FILE="env.synology"
+ COMPOSE_FILE="docker-compose.yml"
+ echo "๐ ์๋๋ก์ง NAS ํ๊ฒฝ์ผ๋ก ์คํํฉ๋๋ค..."
+ ;;
+ *)
+ echo "โ ์ง์ํ์ง ์๋ ํ๊ฒฝ์
๋๋ค: $ENV"
+ echo "์ฌ์ฉ ๊ฐ๋ฅํ ํ๊ฒฝ: dev, prod, synology"
+ exit 1
+ ;;
+esac
+
+# Check that the environment file exists
+if [ ! -f "$ENV_FILE" ]; then
+ echo "โ ํ๊ฒฝ ํ์ผ์ ์ฐพ์ ์ ์์ต๋๋ค: $ENV_FILE"
+ exit 1
+fi
+
+# Check that the Docker Compose file exists
+if [ ! -f "$COMPOSE_FILE" ]; then
+ echo "โ Docker Compose ํ์ผ์ ์ฐพ์ ์ ์์ต๋๋ค: $COMPOSE_FILE"
+ exit 1
+fi
+
+echo "๐ ํ๊ฒฝ ํ์ผ: $ENV_FILE"
+echo "๐ณ Compose ํ์ผ: $COMPOSE_FILE"
+
+# Run the Docker Compose command
+case $CMD in
+ "up")
+ echo "๐ ์ปจํ
์ด๋๋ฅผ ์์ํฉ๋๋ค..."
+ docker-compose --env-file "$ENV_FILE" -f "$COMPOSE_FILE" up -d
+ echo "โ
์ปจํ
์ด๋๊ฐ ์์๋์์ต๋๋ค!"
+ echo ""
+ echo "๐ ์๋น์ค ์ํ:"
+ docker-compose --env-file "$ENV_FILE" -f "$COMPOSE_FILE" ps
+ ;;
+ "down")
+ echo "๐ ์ปจํ
์ด๋๋ฅผ ์ค์งํฉ๋๋ค..."
+ docker-compose --env-file "$ENV_FILE" -f "$COMPOSE_FILE" down
+ echo "โ
์ปจํ
์ด๋๊ฐ ์ค์ง๋์์ต๋๋ค!"
+ ;;
+ "build")
+ echo "๐จ ์ด๋ฏธ์ง๋ฅผ ๋น๋ํฉ๋๋ค..."
+ docker-compose --env-file "$ENV_FILE" -f "$COMPOSE_FILE" build
+ echo "โ
์ด๋ฏธ์ง ๋น๋๊ฐ ์๋ฃ๋์์ต๋๋ค!"
+ ;;
+ "rebuild")
+ echo "๐จ ์ด๋ฏธ์ง๋ฅผ ์ฌ๋น๋ํฉ๋๋ค..."
+ docker-compose --env-file "$ENV_FILE" -f "$COMPOSE_FILE" build --no-cache
+ echo "โ
์ด๋ฏธ์ง ์ฌ๋น๋๊ฐ ์๋ฃ๋์์ต๋๋ค!"
+ ;;
+ "logs")
+ echo "๐ ๋ก๊ทธ๋ฅผ ํ์ธํฉ๋๋ค..."
+ docker-compose --env-file "$ENV_FILE" -f "$COMPOSE_FILE" logs -f
+ ;;
+ "ps"|"status")
+ echo "๐ ์๋น์ค ์ํ:"
+ docker-compose --env-file "$ENV_FILE" -f "$COMPOSE_FILE" ps
+ ;;
+ "restart")
+ echo "๐ ์ปจํ
์ด๋๋ฅผ ์ฌ์์ํฉ๋๋ค..."
+ docker-compose --env-file "$ENV_FILE" -f "$COMPOSE_FILE" restart
+ echo "โ
์ปจํ
์ด๋๊ฐ ์ฌ์์๋์์ต๋๋ค!"
+ ;;
+ *)
+ echo "โ ์ง์ํ์ง ์๋ ๋ช
๋ น์
๋๋ค: $CMD"
+ echo "์ฌ์ฉ ๊ฐ๋ฅํ ๋ช
๋ น: up, down, build, rebuild, logs, ps, restart"
+ exit 1
+ ;;
+esac
+
+echo ""
+echo "๐ฏ ์ฌ์ฉ๋ฒ ์์:"
+echo " ๊ฐ๋ฐ ํ๊ฒฝ ์์: ./docker-run.sh dev up"
+echo " ํ๋ก๋์
ํ๊ฒฝ ์์: ./docker-run.sh prod up"
+echo " ์๋๋ก์ง ํ๊ฒฝ ์์: ./docker-run.sh synology up"
+echo " ๋ก๊ทธ ํ์ธ: ./docker-run.sh dev logs"
+echo " ์ํ ํ์ธ: ./docker-run.sh dev ps"
diff --git a/env.development b/env.development
new file mode 100644
index 0000000..e793a03
--- /dev/null
+++ b/env.development
@@ -0,0 +1,43 @@
+# Development environment settings
+DEPLOY_ENV=development
+COMPOSE_PROJECT_NAME=tk-mp-dev
+
+# Container settings
+CONTAINER_SUFFIX=
+DEBUG=true
+RELOAD=true
+LOG_LEVEL=DEBUG
+
+# Port settings (dev: all ports exposed externally)
+POSTGRES_EXTERNAL_PORT=5432
+REDIS_EXTERNAL_PORT=6379
+BACKEND_EXTERNAL_PORT=18000
+FRONTEND_EXTERNAL_PORT=13000
+PGADMIN_EXTERNAL_PORT=5050
+
+# Volume settings
+POSTGRES_DATA_VOLUME=postgres_data
+REDIS_DATA_VOLUME=redis_data
+PGADMIN_DATA_VOLUME=pgadmin_data
+BACKEND_CODE_VOLUME=./backend
+UPLOADS_VOLUME=./backend/uploads
+
+# Profile settings
+POSTGRES_PROFILE=default
+REDIS_PROFILE=default
+BACKEND_PROFILE=default
+FRONTEND_PROFILE=default
+PGADMIN_PROFILE=dev
+
+# API URL
+VITE_API_URL=http://localhost:18000
+
+# Database settings
+POSTGRES_DB=tk_mp_bom
+POSTGRES_USER=tkmp_user
+POSTGRES_PASSWORD=tkmp_password_2025
+
+# pgAdmin settings
+PGADMIN_EMAIL=admin@example.com
+PGADMIN_PASSWORD=admin2025
+
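`docker-compose --env-file` reads these files itself, but the same values can be exported into an ad-hoc shell session with `set -a`. A sketch using a throwaway copy (the `/tmp/demo.env` path is illustrative):

```shell
#!/bin/sh
# Write a small KEY=VALUE file, then source it with auto-export enabled.
cat > /tmp/demo.env <<'EOF'
POSTGRES_DB=tk_mp_bom
BACKEND_EXTERNAL_PORT=18000
EOF
set -a          # every assignment below is exported
. /tmp/demo.env
set +a
echo "$POSTGRES_DB on port $BACKEND_EXTERNAL_PORT"
```

This prints `tk_mp_bom on port 18000`; note plain sourcing assumes the file contains simple unquoted KEY=VALUE lines, as these env files do.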
diff --git a/env.example b/env.example
new file mode 100644
index 0000000..0f95871
--- /dev/null
+++ b/env.example
@@ -0,0 +1,36 @@
+# TK-MP-Project example environment variables
+# Copy to .env and edit the values before use
+
+# Environment settings
+ENVIRONMENT=development
+DEBUG=true
+
+# Database settings
+POSTGRES_DB=tk_mp_bom
+POSTGRES_USER=tkmp_user
+POSTGRES_PASSWORD=tkmp_password_2025
+POSTGRES_PORT=5432
+
+# Redis settings
+REDIS_PORT=6379
+
+# Application port settings
+BACKEND_PORT=18000
+FRONTEND_PORT=13000
+
+# API URL settings
+VITE_API_URL=http://localhost:18000
+
+# pgAdmin settings
+PGADMIN_EMAIL=admin@example.com
+PGADMIN_PASSWORD=admin2025
+PGADMIN_PORT=5050
+
+# JWT settings (must be changed in production)
+JWT_SECRET_KEY=your-super-secure-secret-key-here
+
+# Logging settings
+LOG_LEVEL=DEBUG
+
+# Security settings
+CORS_ORIGINS=http://localhost:3000,http://localhost:13000,http://localhost:5173
diff --git a/env.production b/env.production
new file mode 100644
index 0000000..7b93137
--- /dev/null
+++ b/env.production
@@ -0,0 +1,43 @@
+# Production environment settings
+DEPLOY_ENV=production
+COMPOSE_PROJECT_NAME=tk-mp-prod
+
+# Container settings
+CONTAINER_SUFFIX=-prod
+DEBUG=false
+RELOAD=false
+LOG_LEVEL=INFO
+
+# Port settings (production: internal service ports not exposed)
+POSTGRES_EXTERNAL_PORT=
+REDIS_EXTERNAL_PORT=
+BACKEND_EXTERNAL_PORT=
+FRONTEND_EXTERNAL_PORT=
+PGADMIN_EXTERNAL_PORT=
+
+# Nginx ports (production only)
+NGINX_HTTP_PORT=80
+NGINX_HTTPS_PORT=443
+
+# Volume settings
+POSTGRES_DATA_VOLUME=postgres_data
+REDIS_DATA_VOLUME=redis_data
+PGADMIN_DATA_VOLUME=pgadmin_data
+BACKEND_CODE_VOLUME=
+UPLOADS_VOLUME=./backend/uploads
+
+# Profile settings (pgAdmin disabled)
+POSTGRES_PROFILE=default
+REDIS_PROFILE=default
+BACKEND_PROFILE=default
+FRONTEND_PROFILE=default
+PGADMIN_PROFILE=disabled
+
+# API URL (relative path in production)
+VITE_API_URL=/api
+
+# Database settings
+POSTGRES_DB=tk_mp_bom
+POSTGRES_USER=tkmp_user
+POSTGRES_PASSWORD=tkmp_password_2025
+
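One caveat worth checking here: the compose file maps ports with `${VAR:-default}`, and the `:-` form substitutes the default when the variable is empty as well as when it is unset. The empty port values above therefore do not suppress the port mappings on their own; verify the rendered result with `docker compose config`. A shell demonstration of the two operators:

```shell
#!/bin/sh
POSTGRES_EXTERNAL_PORT=
echo "${POSTGRES_EXTERNAL_PORT:-5432}"    # empty still triggers the default: 5432
echo "[${POSTGRES_EXTERNAL_PORT-5432}]"   # no colon: the empty value survives: []
```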
diff --git a/env.synology b/env.synology
new file mode 100644
index 0000000..1b1bc4b
--- /dev/null
+++ b/env.synology
@@ -0,0 +1,43 @@
+# Synology NAS environment settings
+DEPLOY_ENV=synology
+COMPOSE_PROJECT_NAME=tk-mp-synology
+
+# Container settings
+CONTAINER_SUFFIX=-synology
+DEBUG=false
+RELOAD=false
+LOG_LEVEL=INFO
+
+# Port settings (Synology: avoid port conflicts)
+POSTGRES_EXTERNAL_PORT=15432
+REDIS_EXTERNAL_PORT=16379
+BACKEND_EXTERNAL_PORT=10080
+FRONTEND_EXTERNAL_PORT=10173
+PGADMIN_EXTERNAL_PORT=15050
+
+# Volume settings (named volumes for Synology)
+POSTGRES_DATA_VOLUME=tk_mp_postgres_data
+REDIS_DATA_VOLUME=tk_mp_redis_data
+PGADMIN_DATA_VOLUME=pgadmin_data
+BACKEND_CODE_VOLUME=
+UPLOADS_VOLUME=tk_mp_uploads
+
+# Profile settings
+POSTGRES_PROFILE=default
+REDIS_PROFILE=default
+BACKEND_PROFILE=default
+FRONTEND_PROFILE=default
+PGADMIN_PROFILE=dev
+
+# API URL (Synology environment)
+VITE_API_URL=http://localhost:10080
+
+# Database settings
+POSTGRES_DB=tk_mp_bom
+POSTGRES_USER=tkmp_user
+POSTGRES_PASSWORD=tkmp_password_2025
+
+# pgAdmin settings
+PGADMIN_EMAIL=admin@example.com
+PGADMIN_PASSWORD=admin2025
+
diff --git a/frontend/nginx.conf b/frontend/nginx.conf
index 55fa86c..57086ec 100644
--- a/frontend/nginx.conf
+++ b/frontend/nginx.conf
@@ -3,6 +3,9 @@ server {
server_name localhost;
root /usr/share/nginx/html;
index index.html index.htm;
+
+ # Increase request size limit (fixes 413 errors)
+ client_max_body_size 100M;
# SPA routing support (React Router, etc.)
location / {
@@ -16,6 +19,10 @@ server {
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
+
+ # Increase proxy request size limit
+ proxy_request_buffering off;
+ client_max_body_size 100M;
}
# Static file caching
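`client_max_body_size 100M` caps request bodies at 100 MiB; anything larger is rejected with HTTP 413 before it reaches the backend. The threshold in bytes (nginx size suffixes use binary multiples):

```shell
#!/bin/sh
# 100M in nginx means 100 * 1024 * 1024 bytes.
limit=$((100 * 1024 * 1024))
echo "$limit"   # prints 104857600
upload=$((150 * 1024 * 1024))
if [ "$upload" -gt "$limit" ]; then
  echo "413 Request Entity Too Large"
fi
```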
diff --git a/frontend/public/img/login-bg.jpeg b/frontend/public/img/login-bg.jpeg
new file mode 100644
index 0000000..1681fd4
Binary files /dev/null and b/frontend/public/img/login-bg.jpeg differ
diff --git a/frontend/public/img/logo.png b/frontend/public/img/logo.png
new file mode 100644
index 0000000..5bbd962
Binary files /dev/null and b/frontend/public/img/logo.png differ
diff --git a/frontend/src/App.jsx b/frontend/src/App.jsx
index 4dab90c..61875d4 100644
--- a/frontend/src/App.jsx
+++ b/frontend/src/App.jsx
@@ -1,13 +1,8 @@
import React, { useState, useEffect } from 'react';
import SimpleLogin from './SimpleLogin';
-import NavigationMenu from './components/NavigationMenu';
-import DashboardPage from './pages/DashboardPage';
-import ProjectsPage from './pages/ProjectsPage';
-import BOMStatusPage from './pages/BOMStatusPage';
-import SimpleMaterialsPage from './pages/SimpleMaterialsPage';
-import MaterialComparisonPage from './pages/MaterialComparisonPage';
-import RevisionPurchasePage from './pages/RevisionPurchasePage';
-import JobSelectionPage from './pages/JobSelectionPage';
+import BOMWorkspacePage from './pages/BOMWorkspacePage';
+import NewMaterialsPage from './pages/NewMaterialsPage';
+import SystemSettingsPage from './pages/SystemSettingsPage';
import './App.css';
function App() {
@@ -16,6 +11,7 @@ function App() {
const [user, setUser] = useState(null);
const [currentPage, setCurrentPage] = useState('dashboard');
const [pageParams, setPageParams] = useState({});
+ const [selectedProject, setSelectedProject] = useState(null);
useEffect(() => {
// Check for a stored token
@@ -28,6 +24,24 @@ function App() {
}
setIsLoading(false);
+
+ // Event listener: navigate to the materials page
+ const handleNavigateToMaterials = (event) => {
+ const { jobNo, revision, bomName, message, file_id } = event.detail;
+ navigateToPage('materials', {
+ jobNo: jobNo,
+ revision: revision,
+ bomName: bomName,
+ message: message,
+ file_id: file_id // pass file_id along
+ });
+ };
+
+ window.addEventListener('navigateToMaterials', handleNavigateToMaterials);
+
+ return () => {
+ window.removeEventListener('navigateToMaterials', handleNavigateToMaterials);
+ };
}, []);
// Called after a successful login
@@ -54,152 +68,393 @@ function App() {
setPageParams(params);
};
+ // Core features only
+ const getCoreFeatures = () => {
+ return [
+ {
+ id: 'bom',
+ title: 'BOM Upload & Classification',
+ description: 'Excel upload → auto classification → review → material confirmation → Excel export',
+ color: '#4299e1'
+ }
+ ];
+ };
+
+ // Admin-only features
+ const getAdminFeatures = () => {
+ if (user?.role !== 'admin') return [];
+
+ return [
+ {
+ id: 'system-settings',
+ title: 'System Settings',
+ description: 'User account management',
+ color: '#dc2626'
+ }
+ ];
+ };
+
// Page rendering
const renderCurrentPage = () => {
+ console.log('current page:', currentPage, 'params:', pageParams);
switch (currentPage) {
case 'dashboard':
[Remainder of the frontend diff was garbled during extraction. Recoverable content: the new dashboard greets the logged-in user ({user?.full_name || user?.username}) and renders the core/admin feature cards; the login hint now reads "Test account: admin / admin123"; the removed materials table listed No. / material name / spec / quantity / unit / category / material grade / confidence / details; the removed comparison table listed category / spec / size / material / BOM quantity / purchase quantity / unit / note, with PIPE rows rendered in mm plus pipe counts and showing cut count, cutting loss, utilization rate, and safety factor; the new SystemSettingsPage is admin-only ("Only administrators can access system settings") and provides a user-management table with username / email / full name / role / status / action columns, hiding the action button on the current user's own row.]