Datasets:
Upload 16 files
- .gitattributes +1 -0
- HVU_QA/HVU.png +3 -0
- HVU_QA/HVU_QA_end_to_end_guide.ipynb +643 -0
- HVU_QA/HVU_QA_tool.bat +43 -0
- HVU_QA/HVU_QA_tool.py +2003 -0
- HVU_QA/backend/__init__.py +3 -0
- HVU_QA/backend/__pycache__/__init__.cpython-311.pyc +0 -0
- HVU_QA/backend/__pycache__/app.cpython-311.pyc +0 -0
- HVU_QA/backend/app.py +319 -0
- HVU_QA/fine_tune_qg.py +556 -0
- HVU_QA/frontend/app.js +1233 -0
- HVU_QA/frontend/index.html +265 -0
- HVU_QA/frontend/style.css +1792 -0
- HVU_QA/generate_question.py +383 -0
- HVU_QA/main.py +31 -0
- HVU_QA/readme.md +392 -0
- HVU_QA/requirements.txt +11 -0
.gitattributes
CHANGED

@@ -479,3 +479,4 @@
 HVU_QA/40k_train.json filter=lfs diff=lfs merge=lfs -text
 HVU_QA/t5-viet-qg-finetuned/best-model/model.safetensors filter=lfs diff=lfs merge=lfs -text
 HVU_QA/t5-viet-qg-finetuned/best-model/spiece.model filter=lfs diff=lfs merge=lfs -text
+HVU_QA/HVU.png filter=lfs diff=lfs merge=lfs -text
HVU_QA/HVU.png
ADDED
Git LFS Details
HVU_QA/HVU_QA_end_to_end_guide.ipynb
ADDED
@@ -0,0 +1,643 @@
{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# HVU_QA - Usage guide notebook\n",
    "\n",
    "This notebook is split into **2 clear flows**:\n",
    "- **Part A - Full project**: for users who download the entire source code to use it and develop it further.\n",
    "- **Part B - Quick run with the tool**: for users who only use `HVU_QA_tool.py` or `HVU_QA_tool.bat` to build the runtime and run the question-generation model.\n",
    "\n",
    "The notebook is written to run from the root directory of the `HVU_QA` repo.\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 0. Prepare shared helpers\n",
    "\n",
    "The cell below will:\n",
    "- find the project root directory\n",
    "- normalize the `venv` path\n",
    "- provide a function to run shell commands from the notebook\n",
    "- provide a function that waits for a stable server response\n",
    "- provide a demo directory for the `Quick tool` part\n"
   ]
  },
  {
   "cell_type": "code",
   "metadata": {},
   "execution_count": null,
   "outputs": [],
   "source": [
    "from __future__ import annotations\n",
    "\n",
    "import json\n",
    "import os\n",
    "import platform\n",
    "import shutil\n",
    "import subprocess\n",
    "import sys\n",
    "import time\n",
    "import urllib.request\n",
    "from pathlib import Path\n",
    "\n",
    "\n",
    "def find_project_root(start: Path) -> Path:\n",
    "    current = start.resolve()\n",
    "    while True:\n",
    "        markers = [\n",
    "            current / 'main.py',\n",
    "            current / 'requirements.txt',\n",
    "            current / 'backend' / 'app.py',\n",
    "            current / 'frontend' / 'index.html',\n",
    "            current / 'HVU_QA_tool.py',\n",
    "        ]\n",
    "        if all(marker.exists() for marker in markers):\n",
    "            return current\n",
    "        if current.parent == current:\n",
    "            raise FileNotFoundError('Could not find the HVU_QA project root from the current notebook.')\n",
    "        current = current.parent\n",
    "\n",
    "\n",
    "PROJECT_ROOT = find_project_root(Path.cwd())\n",
    "os.chdir(PROJECT_ROOT)\n",
    "\n",
    "IS_WINDOWS = platform.system().lower().startswith('win')\n",
    "VENV_DIR = PROJECT_ROOT / 'venv'\n",
    "VENV_PYTHON = VENV_DIR / ('Scripts/python.exe' if IS_WINDOWS else 'bin/python')\n",
    "WEB_LOG_FILE = PROJECT_ROOT / 'hvu_qa_web.log'\n",
    "QUICK_TOOL_DIR = PROJECT_ROOT / '_notebook_quick_tool'\n",
    "QUICK_TOOL_RUNTIME = QUICK_TOOL_DIR / 'HVU_QA_runtime'\n",
    "\n",
    "\n",
    "def print_title(title: str) -> None:\n",
    "    print(f'\\n=== {title} ===')\n",
    "\n",
    "\n",
    "def run_command(command: list[str], *, cwd: Path | None = None, env: dict[str, str] | None = None, check: bool = True):\n",
    "    print_title('Running command')\n",
    "    print(' '.join(command))\n",
    "    result = subprocess.run(\n",
    "        command,\n",
    "        cwd=str(cwd or PROJECT_ROOT),\n",
    "        env=env,\n",
    "        text=True,\n",
    "        encoding='utf-8',\n",
    "        capture_output=True,\n",
    "    )\n",
    "    if result.stdout:\n",
    "        print(result.stdout)\n",
    "    if result.stderr:\n",
    "        print(result.stderr)\n",
    "    if check and result.returncode != 0:\n",
    "        raise RuntimeError(f'Command failed with exit code {result.returncode}')\n",
    "    return result\n",
    "\n",
    "\n",
    "def wait_for_json(url: str, timeout: int = 45):\n",
    "    deadline = time.time() + timeout\n",
    "    last_error = None\n",
    "    while time.time() < deadline:\n",
    "        try:\n",
    "            with urllib.request.urlopen(url, timeout=3) as response:\n",
    "                return json.loads(response.read().decode('utf-8'))\n",
    "        except Exception as exc:  # noqa: BLE001\n",
    "            last_error = exc\n",
    "        time.sleep(1)\n",
    "    raise RuntimeError(f'No JSON response from {url} after {timeout} seconds. Last error: {last_error}')\n",
    "\n",
    "\n",
    "def read_log_tail(path: Path, lines: int = 40) -> str:\n",
    "    if not path.exists():\n",
    "        return '(No server log yet.)'\n",
    "    content = path.read_text(encoding='utf-8', errors='ignore').splitlines()\n",
    "    if not content:\n",
    "        return '(Server log is empty.)'\n",
    "    return '\\n'.join(content[-lines:])\n",
    "\n",
    "\n",
    "print_title('Environment info')\n",
    "print('PROJECT_ROOT =', PROJECT_ROOT)\n",
    "print('Notebook Python =', sys.executable)\n",
    "print('VENV_PYTHON =', VENV_PYTHON)\n",
    "print('IS_WINDOWS =', IS_WINDOWS)\n",
    "print('WEB_LOG_FILE =', WEB_LOG_FILE)\n",
    "print('QUICK_TOOL_DIR =', QUICK_TOOL_DIR)\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Part A - Full project\n",
    "\n",
    "This part covers the case where you have downloaded the **entire `HVU_QA` project** and want to use it or develop it further.\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## A1. Check the project structure\n",
    "\n",
    "This cell quickly confirms that the important files are present in the repo.\n"
   ]
  },
  {
   "cell_type": "code",
   "metadata": {},
   "execution_count": null,
   "outputs": [],
   "source": [
    "important_paths = [\n",
    "    'main.py',\n",
    "    'HVU_QA_tool.py',\n",
    "    'requirements.txt',\n",
    "    'backend/app.py',\n",
    "    'backend/__init__.py',\n",
    "    'frontend/index.html',\n",
    "    'frontend/app.js',\n",
    "    'frontend/style.css',\n",
    "    'generate_question.py',\n",
    "    'fine_tune_qg.py',\n",
    "]\n",
    "\n",
    "print_title('Checking important files')\n",
    "for item in important_paths:\n",
    "    path = PROJECT_ROOT / item\n",
    "    print(f'{item:30} ->', 'OK' if path.exists() else 'MISSING')\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## A2. Create the `venv` virtual environment\n",
    "\n",
    "This cell creates `venv` if it does not exist yet.\n"
   ]
  },
  {
   "cell_type": "code",
   "metadata": {},
   "execution_count": null,
   "outputs": [],
   "source": [
    "if VENV_PYTHON.exists():\n",
    "    print('venv already exists:', VENV_DIR)\n",
    "else:\n",
    "    run_command([sys.executable, '-m', 'venv', str(VENV_DIR)])\n",
    "    print('Created venv at:', VENV_DIR)\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## A3. Install dependencies from `requirements.txt`\n",
    "\n",
    "This cell is for the **full project**. If you only use the single-file launcher, go to **Part B**.\n"
   ]
  },
  {
   "cell_type": "code",
   "metadata": {},
   "execution_count": null,
   "outputs": [],
   "source": [
    "run_command([str(VENV_PYTHON), '-m', 'pip', 'install', '--upgrade', 'pip'])\n",
    "run_command([str(VENV_PYTHON), '-m', 'pip', 'install', '-r', str(PROJECT_ROOT / 'requirements.txt')])\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## A4. Download or sync the model with `HVU_QA_tool.py`\n",
    "\n",
    "The notebook invokes the tool in **full project mode** to sync the model if needed.\n",
    "\n",
    "- `BEST_MODEL_ONLY = False`: downloads the base model from the current repo.\n",
    "- `BEST_MODEL_ONLY = True`: use only when the repo on Hugging Face actually has a `best-model` directory.\n"
   ]
  },
  {
   "cell_type": "code",
   "metadata": {},
   "execution_count": null,
   "outputs": [],
   "source": [
    "BEST_MODEL_ONLY = False\n",
    "full_project_tool_command = [str(VENV_PYTHON), str(PROJECT_ROOT / 'HVU_QA_tool.py'), '--skip-run']\n",
    "if BEST_MODEL_ONLY:\n",
    "    full_project_tool_command.append('--best-model-only')\n",
    "run_command(full_project_tool_command)\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## A5. Check the local model after downloading\n",
    "\n",
    "If the step above used `BEST_MODEL_ONLY = True`, the notebook only requires the `best-model` check.\n"
   ]
  },
  {
   "cell_type": "code",
   "metadata": {},
   "execution_count": null,
   "outputs": [],
   "source": [
    "model_root = PROJECT_ROOT / 't5-viet-qg-finetuned'\n",
    "best_model_only = bool(globals().get('BEST_MODEL_ONLY', False))\n",
    "\n",
    "root_required_files = [\n",
    "    model_root / 'config.json',\n",
    "    model_root / 'generation_config.json',\n",
    "    model_root / 'model.safetensors',\n",
    "    model_root / 'tokenizer_config.json',\n",
    "    model_root / 'special_tokens_map.json',\n",
    "    model_root / 'spiece.model',\n",
    "]\n",
    "\n",
    "best_required_files = [\n",
    "    model_root / 'best-model' / 'config.json',\n",
    "    model_root / 'best-model' / 'generation_config.json',\n",
    "    model_root / 'best-model' / 'model.safetensors',\n",
    "    model_root / 'best-model' / 'tokenizer_config.json',\n",
    "    model_root / 'best-model' / 'special_tokens_map.json',\n",
    "    model_root / 'best-model' / 'spiece.model',\n",
    "]\n",
    "\n",
    "print_title('Checking local model')\n",
    "required_sets = [('best-model', best_required_files)] if best_model_only else [\n",
    "    ('base model', root_required_files),\n",
    "    ('best-model', best_required_files),\n",
    "]\n",
    "\n",
    "for label, files in required_sets:\n",
    "    print(f'\\n{label}:')\n",
    "    for item in files:\n",
    "        print(item.relative_to(PROJECT_ROOT), '->', 'OK' if item.exists() else 'MISSING')\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## A6. Run the web app with `main.py`\n",
    "\n",
    "This cell runs the Flask server in the background, **does not automatically open a browser**, and writes logs to `hvu_qa_web.log`.\n"
   ]
  },
  {
   "cell_type": "code",
   "metadata": {},
   "execution_count": null,
   "outputs": [],
   "source": [
    "WEB_HOST = '127.0.0.1'\n",
    "WEB_PORT = '5000'\n",
    "base_url = f'http://{WEB_HOST}:{WEB_PORT}'\n",
    "\n",
    "if 'web_process' in globals() and web_process and web_process.poll() is None:\n",
    "    print('Web app is already running. PID =', web_process.pid)\n",
    "    print('URL =', base_url)\n",
    "else:\n",
    "    web_env = os.environ.copy()\n",
    "    web_env['HVU_HOST'] = WEB_HOST\n",
    "    web_env['HVU_PORT'] = WEB_PORT\n",
    "    web_env['HVU_OPEN_BROWSER'] = '0'\n",
    "\n",
    "    if WEB_LOG_FILE.exists():\n",
    "        WEB_LOG_FILE.unlink()\n",
    "\n",
    "    with WEB_LOG_FILE.open('w', encoding='utf-8') as log_stream:\n",
    "        web_process = subprocess.Popen(\n",
    "            [str(VENV_PYTHON), str(PROJECT_ROOT / 'main.py')],\n",
    "            cwd=str(PROJECT_ROOT),\n",
    "            env=web_env,\n",
    "            stdout=log_stream,\n",
    "            stderr=subprocess.STDOUT,\n",
    "        )\n",
    "\n",
    "    try:\n",
    "        info_payload = wait_for_json(base_url + '/api/info', timeout=45)\n",
    "    except Exception as exc:  # noqa: BLE001\n",
    "        return_code = web_process.poll()\n",
    "        raise RuntimeError(\n",
    "            'The web app failed to start. '\n",
    "            f'returncode={return_code}\\n\\nMost recent log:\\n{read_log_tail(WEB_LOG_FILE, lines=80)}'\n",
    "        ) from exc\n",
    "\n",
    "    print('Started web app. PID =', web_process.pid)\n",
    "    print('URL =', base_url)\n",
    "    print('Selected model =', info_payload.get('selected_model_id'))\n",
    "    print('Displayed model name =', info_payload.get('model_name'))\n",
    "    print('Server log =', WEB_LOG_FILE)\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## A7. Try the backend API\n",
    "\n",
    "This cell calls `GET /api/info`, `POST /api/generate`, and tries `POST /api/model` if more than one model is available.\n"
   ]
  },
  {
   "cell_type": "code",
   "metadata": {},
   "execution_count": null,
   "outputs": [],
   "source": [
    "def http_get_json(url: str):\n",
    "    with urllib.request.urlopen(url) as response:\n",
    "        return json.loads(response.read().decode('utf-8'))\n",
    "\n",
    "\n",
    "def http_post_json(url: str, payload: dict):\n",
    "    data = json.dumps(payload, ensure_ascii=False).encode('utf-8')\n",
    "    request = urllib.request.Request(url, data=data, headers={'Content-Type': 'application/json'})\n",
    "    with urllib.request.urlopen(request) as response:\n",
    "        return json.loads(response.read().decode('utf-8'))\n",
    "\n",
    "\n",
    "info_payload = http_get_json(base_url + '/api/info')\n",
    "print_title('GET /api/info')\n",
    "print(json.dumps(info_payload, ensure_ascii=False, indent=2))\n",
    "\n",
    "generate_payload = {\n",
    "    'text': 'Cơ sở giáo dục đại học có nhiệm vụ tổ chức đào tạo, nghiên cứu khoa học và phục vụ cộng đồng.',\n",
    "    'num_questions': 3,\n",
    "}\n",
    "generate_result = http_post_json(base_url + '/api/generate', generate_payload)\n",
    "print_title('POST /api/generate')\n",
    "print(json.dumps(generate_result, ensure_ascii=False, indent=2))\n",
    "\n",
    "available_models = info_payload.get('available_models', [])\n",
    "print_title('Available models')\n",
    "print(json.dumps(available_models, ensure_ascii=False, indent=2))\n",
    "\n",
    "if len(available_models) < 2:\n",
    "    print('Only one model is available, so the model-switch step is skipped.')\n",
    "else:\n",
    "    current_model_id = info_payload.get('selected_model_id')\n",
    "    target_model_id = next(item['id'] for item in available_models if item['id'] != current_model_id)\n",
    "    switched_payload = http_post_json(base_url + '/api/model', {'model_id': target_model_id})\n",
    "    print_title('POST /api/model')\n",
    "    print(json.dumps(switched_payload, ensure_ascii=False, indent=2))\n",
    "\n",
    "    restored_payload = http_post_json(base_url + '/api/model', {'model_id': current_model_id})\n",
    "    print_title('Restore the original model')\n",
    "    print(json.dumps(restored_payload, ensure_ascii=False, indent=2))\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## A8. Run `generate_question.py` from the CLI\n",
    "\n",
    "This cell shows how to run the CLI directly without opening the web UI.\n"
   ]
  },
  {
   "cell_type": "code",
   "metadata": {},
   "execution_count": null,
   "outputs": [],
   "source": [
    "cli_text = 'Cơ sở giáo dục đại học thực hiện hoạt động đào tạo, nghiên cứu khoa học và phục vụ cộng đồng theo quy định của pháp luật.'\n",
    "run_command([\n",
    "    str(VENV_PYTHON),\n",
    "    str(PROJECT_ROOT / 'generate_question.py'),\n",
    "    '--text',\n",
    "    cli_text,\n",
    "    '--num_questions',\n",
    "    '3',\n",
    "    '--output_format',\n",
    "    'text',\n",
    "])\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## A9. View sample fine-tune commands\n",
    "\n",
    "Fine-tuning is a heavy task, so the notebook only prints sample commands for you to copy when needed.\n"
   ]
  },
  {
   "cell_type": "code",
   "metadata": {},
   "execution_count": null,
   "outputs": [],
   "source": [
    "print_title('Fine-tune on CPU')\n",
    "print(f'{VENV_PYTHON} fine_tune_qg.py --device cpu --output_dir t5-viet-qg-finetuned-cpu')\n",
    "\n",
    "print_title('Fine-tune on GPU')\n",
    "print(\n",
    "    f'{VENV_PYTHON} fine_tune_qg.py --device cuda --fp16 --gradient_checkpointing '\n",
    "    '--output_dir t5-viet-qg-finetuned'\n",
    ")\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## A10. Stop the web app\n",
    "\n",
    "When you are done, run this cell to stop the server that was started in the background.\n"
   ]
  },
  {
   "cell_type": "code",
   "metadata": {},
   "execution_count": null,
   "outputs": [],
   "source": [
    "if 'web_process' in globals() and web_process and web_process.poll() is None:\n",
    "    web_process.terminate()\n",
    "    try:\n",
    "        web_process.wait(timeout=5)\n",
    "    except subprocess.TimeoutExpired:\n",
    "        web_process.kill()\n",
    "        web_process.wait(timeout=5)\n",
    "    print('Stopped web app. PID =', web_process.pid)\n",
    "else:\n",
    "    print('No web app is running.')\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Part B - Quick run with the tool\n",
    "\n",
    "This part simulates exactly the case where **the user only has `HVU_QA_tool.py` or `HVU_QA_tool.bat`** in an empty directory.\n",
    "\n",
    "The new `HVU_QA_tool.py` will:\n",
    "- detect on its own that the full project is not next to it\n",
    "- build `HVU_QA_runtime/` automatically\n",
    "- create its own virtualenv if needed\n",
    "- install the runtime dependencies automatically\n",
    "- download the model from Hugging Face automatically\n",
    "- open the app automatically\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## B1. Create a demo directory containing only the tool\n",
    "\n",
    "This cell copies `HVU_QA_tool.py` and `HVU_QA_tool.bat` into a separate demo directory.\n"
   ]
  },
  {
   "cell_type": "code",
   "metadata": {},
   "execution_count": null,
   "outputs": [],
   "source": [
    "if QUICK_TOOL_DIR.exists():\n",
    "    shutil.rmtree(QUICK_TOOL_DIR)\n",
    "QUICK_TOOL_DIR.mkdir(parents=True, exist_ok=True)\n",
    "shutil.copy2(PROJECT_ROOT / 'HVU_QA_tool.py', QUICK_TOOL_DIR / 'HVU_QA_tool.py')\n",
    "shutil.copy2(PROJECT_ROOT / 'HVU_QA_tool.bat', QUICK_TOOL_DIR / 'HVU_QA_tool.bat')\n",
    "\n",
    "print_title('Quick tool directory')\n",
    "for item in QUICK_TOOL_DIR.iterdir():\n",
    "    print(item.name)\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## B2. Build a standalone runtime from just the tool file\n",
    "\n",
    "This cell runs `HVU_QA_tool.py` in the demo directory in `--prepare-runtime-only` mode to demonstrate that the tool **does not depend on a local full project**.\n"
   ]
  },
  {
   "cell_type": "code",
   "metadata": {},
   "execution_count": null,
   "outputs": [],
   "source": [
    "run_command([\n",
    "    str(VENV_PYTHON),\n",
    "    str(QUICK_TOOL_DIR / 'HVU_QA_tool.py'),\n",
    "    '--prepare-runtime-only',\n",
    "], cwd=QUICK_TOOL_DIR)\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## B3. Inspect the generated runtime\n",
    "\n",
    "This cell lists the minimal runtime files that the launcher built automatically.\n"
   ]
  },
  {
   "cell_type": "code",
   "metadata": {},
   "execution_count": null,
   "outputs": [],
   "source": [
    "print_title('Standalone runtime files')\n",
    "for path in sorted(QUICK_TOOL_RUNTIME.rglob('*')):\n",
    "    if path.is_file():\n",
    "        print(path.relative_to(QUICK_TOOL_DIR).as_posix())\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## B4. The actual commands an end user would run\n",
    "\n",
    "In practice, the user only needs to put `HVU_QA_tool.py` (or the pair `HVU_QA_tool.py` + `HVU_QA_tool.bat`) into a directory and run one of the following commands.\n"
   ]
  },
  {
   "cell_type": "code",
   "metadata": {},
   "execution_count": null,
   "outputs": [],
   "source": [
    "print_title('Quick tool commands')\n",
    "print('python HVU_QA_tool.py')\n",
    "print('')\n",
    "print('Or on Windows: double-click HVU_QA_tool.bat')\n",
    "print('')\n",
    "print('The launcher will automatically create:')\n",
    "print('- .hvu_qa_tool_venv/ if the machine is not already inside a virtualenv')\n",
    "print('- HVU_QA_runtime/ if the current directory does not contain the full project')\n",
    "print('- t5-viet-qg-finetuned/ inside the runtime to hold the model')\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## B5. Actually run quick tool mode\n",
    "\n",
    "The cell below is **optional**. By default the notebook does not actually run it, because this step may download dependencies and the model from Hugging Face.\n",
    "\n",
    "Notes:\n",
    "- `--best-model-only` only works when the repo on Hugging Face actually has `best-model`.\n",
    "- If the current repo only has the base model, the launcher reports a clear error when you force `--best-model-only`.\n"
   ]
  },
  {
   "cell_type": "code",
   "metadata": {},
   "execution_count": null,
   "outputs": [],
   "source": [
    "RUN_QUICK_TOOL_NOW = False\n",
    "\n",
    "quick_tool_command = [\n",
    "    str(VENV_PYTHON),\n",
    "    str(QUICK_TOOL_DIR / 'HVU_QA_tool.py'),\n",
    "]\n",
    "\n",
    "if RUN_QUICK_TOOL_NOW:\n",
    "    run_command(quick_tool_command, cwd=QUICK_TOOL_DIR)\n",
    "else:\n",
    "    print('Skipping the real run to avoid unwanted dependency/model downloads.')\n",
    "    print('When needed, set RUN_QUICK_TOOL_NOW = True and rerun this cell.')\n",
    "    print('Command that would run:')\n",
    "    print(' '.join(quick_tool_command))\n"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "name": "python",
   "version": "3.11"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
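The notebook's `wait_for_json` helper is an instance of a generic poll-until-deadline pattern: retry a check, remember the last failure, and raise with that context once the deadline passes. A minimal standalone sketch of the same pattern (the names `poll_until` and `fake_health_check` are illustrative, not part of the repo):

```python
import time

def poll_until(check, timeout=5.0, interval=0.1):
    """Call `check` until it returns a non-None value or the deadline passes."""
    deadline = time.monotonic() + timeout
    last_error = None
    while time.monotonic() < deadline:
        try:
            result = check()
            if result is not None:
                return result
        except Exception as exc:  # keep the last failure for the final error message
            last_error = exc
        time.sleep(interval)
    raise TimeoutError(f"no result within {timeout}s; last error: {last_error}")

# Demo: a "server" that becomes ready on the third attempt.
attempts = {"n": 0}

def fake_health_check():
    attempts["n"] += 1
    return {"status": "ok"} if attempts["n"] >= 3 else None

print(poll_until(fake_health_check, timeout=2.0, interval=0.01))
```

Using `time.monotonic()` rather than `time.time()` makes the deadline immune to wall-clock adjustments, which matters for long startup waits.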
HVU_QA/HVU_QA_tool.bat
ADDED
@@ -0,0 +1,43 @@
@echo off
setlocal
cd /d "%~dp0"
chcp 65001 >nul 2>&1

echo [HVU_QA_tool] Starting the launcher...
echo [HVU_QA_tool] The runtime and virtualenv will be created automatically if needed.
echo.

if exist ".\venv\Scripts\python.exe" (
    call ".\venv\Scripts\python.exe" HVU_QA_tool.py %*
    set "EXIT_CODE=%ERRORLEVEL%"
    goto :done
)

where py >nul 2>&1
if not errorlevel 1 (
    call py -3 HVU_QA_tool.py %*
    set "EXIT_CODE=%ERRORLEVEL%"
    goto :done
)

where python >nul 2>&1
if not errorlevel 1 (
    call python HVU_QA_tool.py %*
    set "EXIT_CODE=%ERRORLEVEL%"
    goto :done
)

echo Python was not found on this machine.
echo Install Python 3.11+ or create a venv in the project directory.
set "EXIT_CODE=1"

:done
echo.
if not "%EXIT_CODE%"=="0" (
    echo [HVU_QA_tool] An error occurred. Exit code: %EXIT_CODE%
) else (
    echo [HVU_QA_tool] Finished.
)
echo.
pause
exit /b %EXIT_CODE%
HVU_QA/HVU_QA_tool.py
ADDED
@@ -0,0 +1,2003 @@
| 1 |
+
from __future__ import annotations
|
| 2 |
+
|
| 3 |
+
import argparse
|
| 4 |
+
import fnmatch
|
| 5 |
+
import importlib.util
|
| 6 |
+
import os
|
| 7 |
+
import shutil
|
| 8 |
+
import subprocess
|
| 9 |
+
import sys
|
| 10 |
+
import textwrap
|
| 11 |
+
from dataclasses import dataclass
|
| 12 |
+
from pathlib import Path
|
| 13 |
+
|
| 14 |
+
SCRIPT_ROOT = Path(__file__).resolve().parent
|
| 15 |
+
IS_WINDOWS = os.name == "nt"
|
| 16 |
+
TOOL_VENV_DIR = SCRIPT_ROOT / ".hvu_qa_tool_venv"
|
| 17 |
+
TOOL_VENV_PYTHON = TOOL_VENV_DIR / ("Scripts/python.exe" if IS_WINDOWS else "bin/python")
|
| 18 |
+
|
| 19 |
+
HF_DATASET_REPO_ID = "DANGDOCAO/GeneratingQuestions"
|
| 20 |
+
HF_DATASET_REVISION = "main"
|
| 21 |
+
HF_PROJECT_SUBDIR = "HVU_QA"
|
| 22 |
+
HF_MODEL_SUBDIR = f"{HF_PROJECT_SUBDIR}/t5-viet-qg-finetuned"
|
| 23 |
+
HF_BEST_MODEL_SUBDIR = f"{HF_MODEL_SUBDIR}/best-model"
|
| 24 |
+
|
| 25 |
+
HF_HUB_REQUIREMENT = "huggingface_hub>=0.23.0,<1.0.0"
|
| 26 |
+
RUNTIME_REQUIREMENTS = [
|
| 27 |
+
"Flask>=3.0.0,<4.0.0",
|
| 28 |
+
HF_HUB_REQUIREMENT,
|
| 29 |
+
"numpy>=1.26.0,<3.0.0",
|
| 30 |
+
"safetensors>=0.4.3,<1.0.0",
|
| 31 |
+
"sentencepiece>=0.2.0,<1.0.0",
|
| 32 |
+
"torch>=2.2.0,<3.0.0",
|
| 33 |
+
"transformers>=4.41.0,<5.0.0",
|
| 34 |
+
]
|
| 35 |
+
LOCAL_PROJECT_MARKERS = [
|
| 36 |
+
"main.py",
|
| 37 |
+
"backend/app.py",
|
| 38 |
+
"frontend/index.html",
|
| 39 |
+
"generate_question.py",
|
| 40 |
+
]
|
| 41 |
+
DEPENDENCY_IMPORTS = {
|
| 42 |
+
"Flask": "flask",
|
| 43 |
+
"numpy": "numpy",
|
| 44 |
+
"torch": "torch",
|
| 45 |
+
"transformers": "transformers",
|
| 46 |
+
"sentencepiece": "sentencepiece",
|
| 47 |
+
"safetensors": "safetensors",
|
| 48 |
+
"huggingface_hub": "huggingface_hub",
|
| 49 |
+
}
|
| 50 |
+
MODEL_IGNORE_PATTERNS = [
|
| 51 |
+
f"{HF_MODEL_SUBDIR}/checkpoint-*/**",
|
| 52 |
+
f"{HF_MODEL_SUBDIR}/all_results.json",
|
| 53 |
+
f"{HF_MODEL_SUBDIR}/eval_results.json",
|
| 54 |
+
f"{HF_MODEL_SUBDIR}/train_results.json",
|
| 55 |
+
f"{HF_MODEL_SUBDIR}/trainer_state.json",
|
| 56 |
+
f"{HF_MODEL_SUBDIR}/training_summary.json",
|
| 57 |
+
f"{HF_MODEL_SUBDIR}/training_args.bin",
|
| 58 |
+
f"{HF_BEST_MODEL_SUBDIR}/training_args.bin",
|
| 59 |
+
]
|
| 60 |
+
|
| 61 |
+
|
| 62 |
+
@dataclass(frozen=True)
|
| 63 |
+
class RuntimeContext:
|
| 64 |
+
root: Path
|
| 65 |
+
main_file: Path
|
| 66 |
+
requirements_file: Path
|
| 67 |
+
local_model_dir: Path
|
| 68 |
+
local_best_model_dir: Path
|
| 69 |
+
standalone_mode: bool
|
| 70 |
+
|
| 71 |
+
|
| 72 |
+
def print_step(message: str) -> None:
|
| 73 |
+
print(f"[HVU_QA_tool] {message}")
|
| 74 |
+
|
| 75 |
+
|
| 76 |
+
def module_exists(module_name: str) -> bool:
|
| 77 |
+
return importlib.util.find_spec(module_name) is not None
|
| 78 |
+
|
| 79 |
+
|
| 80 |
+
def run_command(
|
| 81 |
+
command: list[str],
|
| 82 |
+
*,
|
| 83 |
+
cwd: Path | None = None,
|
| 84 |
+
env: dict[str, str] | None = None,
|
| 85 |
+
) -> None:
|
| 86 |
+
subprocess.check_call(command, cwd=str(cwd) if cwd else None, env=env)
|
| 87 |
+
|
| 88 |
+
|
| 89 |
+
def is_running_in_virtualenv() -> bool:
|
| 90 |
+
return sys.prefix != getattr(sys, "base_prefix", sys.prefix) or bool(os.getenv("VIRTUAL_ENV"))
|
| 91 |
+
|
| 92 |
+
|
| 93 |
+
def format_bytes(size: int) -> str:
|
| 94 |
+
units = ["B", "KB", "MB", "GB", "TB"]
|
| 95 |
+
value = float(size)
|
| 96 |
+
for unit in units:
|
| 97 |
+
if value < 1024 or unit == units[-1]:
|
| 98 |
+
if unit == "B":
|
| 99 |
+
return f"{int(value)} {unit}"
|
| 100 |
+
return f"{value:.1f} {unit}"
|
| 101 |
+
value /= 1024
|
| 102 |
+
return f"{size} B"
|
| 103 |
+
|
| 104 |
+
|
| 105 |
+
def render_progress_bar(current: int, total: int, width: int = 28) -> str:
|
| 106 |
+
if total <= 0:
|
| 107 |
+
return "[----------------------------] 0.0%"
|
| 108 |
+
|
| 109 |
+
ratio = max(0.0, min(1.0, current / total))
|
| 110 |
+
filled = int(ratio * width)
|
| 111 |
+
bar = "#" * filled + "-" * (width - filled)
|
| 112 |
+
percent = ratio * 100
|
| 113 |
+
return f"[{bar}] {percent:5.1f}%"
|
| 114 |
+
|
| 115 |
+
|
| 116 |
+
def matches_any_pattern(path: str, patterns: list[str]) -> bool:
|
| 117 |
+
normalized = path.replace("\\", "/")
|
| 118 |
+
return any(fnmatch.fnmatch(normalized, pattern) for pattern in patterns)
|
| 119 |
+
|
| 120 |
+
|
| 121 |
+
def build_allow_patterns(best_model_only: bool) -> list[str]:
|
| 122 |
+
if best_model_only:
|
| 123 |
+
return [f"{HF_BEST_MODEL_SUBDIR}/**"]
|
| 124 |
+
return [f"{HF_MODEL_SUBDIR}/**"]
|
| 125 |
+
|
| 126 |
+
|
| 127 |
+
def has_local_project(root: Path) -> bool:
|
| 128 |
+
return all((root / marker).exists() for marker in LOCAL_PROJECT_MARKERS)
|
| 129 |
+
|
| 130 |
+
|
| 131 |
+
def build_runtime_requirements_text() -> str:
|
| 132 |
+
lines = [
|
| 133 |
+
"# Runtime dependencies for standalone HVU_QA launcher.",
|
| 134 |
+
"# Nếu dùng GPU NVIDIA, hãy cài đúng bản torch theo CUDA của máy nếu cần.",
|
| 135 |
+
*RUNTIME_REQUIREMENTS,
|
| 136 |
+
"",
|
| 137 |
+
]
|
| 138 |
+
return "\n".join(lines)
|
| 139 |
+
|
| 140 |
+
|
| 141 |
+
def build_runtime_file_map() -> dict[str, str]:
|
| 142 |
+
requirements_text = build_runtime_requirements_text()
|
| 143 |
+
return {
|
| 144 |
+
"requirements.txt": requirements_text,
|
| 145 |
+
"main.py": textwrap.dedent(
|
| 146 |
+
"""
|
| 147 |
+
from __future__ import annotations
|
| 148 |
+
|
| 149 |
+
import os
|
| 150 |
+
import threading
|
| 151 |
+
import webbrowser
|
| 152 |
+
|
| 153 |
+
from backend import create_app
|
| 154 |
+
|
| 155 |
+
app = create_app()
|
| 156 |
+
|
| 157 |
+
|
| 158 |
+
def _as_bool(value: str | None, default: bool) -> bool:
|
| 159 |
+
if value is None:
|
| 160 |
+
return default
|
| 161 |
+
return value.strip().lower() not in {"0", "false", "no", "off"}
|
| 162 |
+
|
| 163 |
+
|
| 164 |
+
def _open_browser_later(host: str, port: int) -> None:
|
| 165 |
+
if not _as_bool(os.getenv("HVU_OPEN_BROWSER"), True):
|
| 166 |
+
return
|
| 167 |
+
target_host = "127.0.0.1" if host in {"0.0.0.0", "::"} else host
|
| 168 |
+
url = f"http://{target_host}:{port}"
|
| 169 |
+
threading.Timer(1.2, lambda: webbrowser.open(url)).start()
|
| 170 |
+
|
| 171 |
+
|
| 172 |
+
if __name__ == "__main__":
|
| 173 |
+
host = os.getenv("HVU_HOST", "127.0.0.1")
|
| 174 |
+
port = int(os.getenv("HVU_PORT", "5000"))
|
| 175 |
+
debug = _as_bool(os.getenv("HVU_DEBUG"), False)
|
| 176 |
+
_open_browser_later(host, port)
|
| 177 |
+
app.run(host=host, port=port, debug=debug, use_reloader=False)
|
| 178 |
+
"""
|
| 179 |
+
).strip()
|
| 180 |
+
+ "\n",
|
| 181 |
+
"backend/__init__.py": 'from .app import create_app\n\n__all__ = ["create_app"]\n',
|
| 182 |
+
"backend/app.py": textwrap.dedent(
|
| 183 |
+
"""
|
| 184 |
+
from __future__ import annotations
|
| 185 |
+
|
| 186 |
+
import os
|
| 187 |
+
import time
|
| 188 |
+
from pathlib import Path
|
| 189 |
+
|
| 190 |
+
from flask import Flask, jsonify, request, send_from_directory
|
| 191 |
+
|
| 192 |
+
from generate_question import (
|
| 193 |
+
APP_TITLE,
|
| 194 |
+
QUESTION_LIMIT,
|
| 195 |
+
QuestionGenerator,
|
| 196 |
+
format_questions,
|
| 197 |
+
normalize_text,
|
| 198 |
+
parse_question_count,
|
| 199 |
+
resolve_model_dir,
|
| 200 |
+
)
|
| 201 |
+
|
| 202 |
+
IGNORED_MODEL_DIR_NAMES = {
|
| 203 |
+
".git",
|
| 204 |
+
".vscode",
|
| 205 |
+
"__pycache__",
|
| 206 |
+
"backend",
|
| 207 |
+
"frontend",
|
| 208 |
+
"venv",
|
| 209 |
+
".hvu_qa_tool_venv",
|
| 210 |
+
"HVU_QA_runtime",
|
| 211 |
+
}
|
| 212 |
+
|
| 213 |
+
|
| 214 |
+
def project_root() -> Path:
|
| 215 |
+
return Path(__file__).resolve().parents[1]
|
| 216 |
+
|
| 217 |
+
|
| 218 |
+
def _read_optional_int(value: str | None) -> int | None:
|
| 219 |
+
if value in (None, ""):
|
| 220 |
+
return None
|
| 221 |
+
return int(value)
|
| 222 |
+
|
| 223 |
+
|
| 224 |
+
def build_generator(
|
| 225 |
+
model_dir: str | Path | None = None,
|
| 226 |
+
prefer_nested_model: bool = True,
|
| 227 |
+
) -> QuestionGenerator:
|
| 228 |
+
root = project_root()
|
| 229 |
+
selected_model_dir = (
|
| 230 |
+
Path(model_dir).expanduser()
|
| 231 |
+
if model_dir is not None
|
| 232 |
+
else Path(os.getenv("HVU_MODEL_DIR", str(root / "t5-viet-qg-finetuned"))).expanduser()
|
| 233 |
+
)
|
| 234 |
+
if not selected_model_dir.is_absolute():
|
| 235 |
+
selected_model_dir = root / selected_model_dir
|
| 236 |
+
|
| 237 |
+
return QuestionGenerator(
|
| 238 |
+
model_dir=str(selected_model_dir),
|
| 239 |
+
task_prefix=os.getenv("HVU_TASK_PREFIX", "sinh câu hỏi"),
|
| 240 |
+
max_source_length=int(os.getenv("HVU_MAX_SOURCE_LENGTH", "512")),
|
| 241 |
+
max_new_tokens=int(os.getenv("HVU_MAX_NEW_TOKENS", "64")),
|
| 242 |
+
device=os.getenv("HVU_DEVICE", "auto"),
|
| 243 |
+
cpu_threads=_read_optional_int(os.getenv("HVU_CPU_THREADS")),
|
| 244 |
+
gpu_dtype=os.getenv("HVU_GPU_DTYPE", "auto"),
|
| 245 |
+
prefer_nested_model=prefer_nested_model,
|
| 246 |
+
)
|
| 247 |
+
|
| 248 |
+
|
| 249 |
+
def _model_label(relative_path: str | Path) -> str:
|
| 250 |
+
path = Path(relative_path)
|
| 251 |
+
return path.name or "model"
|
| 252 |
+
|
| 253 |
+
|
| 254 |
+
def _iter_model_candidates(root: Path):
|
| 255 |
+
for child in sorted(root.iterdir(), key=lambda path: path.name.lower()):
|
| 256 |
+
if not child.is_dir() or child.name.startswith(".") or child.name in IGNORED_MODEL_DIR_NAMES:
|
| 257 |
+
continue
|
| 258 |
+
|
| 259 |
+
if (child / "config.json").exists():
|
| 260 |
+
yield {"path": child, "prefer_nested_model": False}
|
| 261 |
+
|
| 262 |
+
for nested_name in ("best-model", "final-model"):
|
| 263 |
+
nested = child / nested_name
|
| 264 |
+
if nested.is_dir() and (nested / "config.json").exists():
|
| 265 |
+
yield {"path": nested, "prefer_nested_model": False}
|
| 266 |
+
|
| 267 |
+
|
| 268 |
+
def _discover_available_models(
|
| 269 |
+
root: Path,
|
| 270 |
+
active_generator: QuestionGenerator | None = None,
|
| 271 |
+
) -> list[dict[str, str]]:
|
| 272 |
+
models: list[dict[str, str]] = []
|
| 273 |
+
seen_roots: set[str] = set()
|
| 274 |
+
root = root.resolve()
|
| 275 |
+
|
| 276 |
+
for candidate_info in _iter_model_candidates(root):
|
| 277 |
+
candidate = candidate_info["path"]
|
| 278 |
+
model_key = str(candidate.resolve())
|
| 279 |
+
if model_key in seen_roots:
|
| 280 |
+
continue
|
| 281 |
+
|
| 282 |
+
try:
|
| 283 |
+
relative_candidate = candidate.resolve().relative_to(root)
|
| 284 |
+
except ValueError:
|
| 285 |
+
continue
|
| 286 |
+
|
| 287 |
+
seen_roots.add(model_key)
|
| 288 |
+
models.append(
|
| 289 |
+
{
|
| 290 |
+
"id": relative_candidate.as_posix(),
|
| 291 |
+
"label": _model_label(relative_candidate),
|
| 292 |
+
"model_root": str(candidate.resolve()),
|
| 293 |
+
"model_dir": str(resolve_model_dir(candidate, prefer_nested_model=False).resolve()),
|
| 294 |
+
"prefer_nested_model": bool(candidate_info["prefer_nested_model"]),
|
| 295 |
+
}
|
| 296 |
+
)
|
| 297 |
+
|
| 298 |
+
if active_generator is not None:
|
| 299 |
+
current_root = active_generator.model_root.resolve()
|
| 300 |
+
current_dir = active_generator.model_dir.resolve()
|
| 301 |
+
exists = any(
|
| 302 |
+
Path(item["model_root"]).resolve() == current_root
|
| 303 |
+
or Path(item["model_dir"]).resolve() == current_dir
|
| 304 |
+
for item in models
|
| 305 |
+
)
|
| 306 |
+
if not exists:
|
| 307 |
+
models.append(
|
| 308 |
+
{
|
| 309 |
+
"id": current_root.as_posix(),
|
| 310 |
+
"label": current_root.name,
|
| 311 |
+
"model_root": str(current_root),
|
| 312 |
+
"model_dir": str(current_dir),
|
| 313 |
+
"prefer_nested_model": False,
|
| 314 |
+
}
|
| 315 |
+
)
|
| 316 |
+
|
| 317 |
+
return models
|
| 318 |
+
|
| 319 |
+
|
| 320 |
+
def _selected_model_id(
|
| 321 |
+
app: Flask,
|
| 322 |
+
models: list[dict[str, str]],
|
| 323 |
+
active_generator: QuestionGenerator | None = None,
|
| 324 |
+
) -> str:
|
| 325 |
+
explicit_selection = str(app.config.get("SELECTED_MODEL_ID") or "").strip()
|
| 326 |
+
if explicit_selection and any(item["id"] == explicit_selection for item in models):
|
| 327 |
+
return explicit_selection
|
| 328 |
+
|
| 329 |
+
active_generator = active_generator or _generator(app)
|
| 330 |
+
current_root = active_generator.model_root.resolve()
|
| 331 |
+
current_dir = active_generator.model_dir.resolve()
|
| 332 |
+
|
| 333 |
+
for item in models:
|
| 334 |
+
if Path(item["model_dir"]).resolve() == current_dir:
|
| 335 |
+
return item["id"]
|
| 336 |
+
|
| 337 |
+
for item in models:
|
| 338 |
+
if Path(item["model_root"]).resolve() == current_root:
|
| 339 |
+
return item["id"]
|
| 340 |
+
|
| 341 |
+
return models[0]["id"] if models else ""
|
| 342 |
+
|
| 343 |
+
|
| 344 |
+
def _switch_generator(app: Flask, model_id: str) -> QuestionGenerator:
|
| 345 |
+
available_models = _discover_available_models(app.config["PROJECT_ROOT"], _generator(app))
|
| 346 |
+
selected_model = next((item for item in available_models if item["id"] == model_id), None)
|
| 347 |
+
if selected_model is None:
|
| 348 |
+
raise ValueError("Model được chọn không hợp lệ hoặc chưa tồn tại trong thư mục runtime.")
|
| 349 |
+
|
| 350 |
+
current_model_id = _selected_model_id(app, available_models)
|
| 351 |
+
if current_model_id != model_id:
|
| 352 |
+
app.config["GENERATOR"] = build_generator(
|
| 353 |
+
selected_model["model_root"],
|
| 354 |
+
prefer_nested_model=bool(selected_model.get("prefer_nested_model")),
|
| 355 |
+
)
|
| 356 |
+
|
| 357 |
+
app.config["SELECTED_MODEL_ID"] = model_id
|
| 358 |
+
return _generator(app)
|
| 359 |
+
|
| 360 |
+
|
| 361 |
+
def _info_payload(app: Flask, active_generator: QuestionGenerator | None = None) -> dict[str, object]:
|
| 362 |
+
active_generator = active_generator or _generator(app)
|
| 363 |
+
available_models = _discover_available_models(app.config["PROJECT_ROOT"], active_generator)
|
| 364 |
+
selected_model_id = _selected_model_id(app, available_models, active_generator)
|
| 365 |
+
model_name = next(
|
| 366 |
+
(item["label"] for item in available_models if item["id"] == selected_model_id),
|
| 367 |
+
Path(active_generator.model_dir).name,
|
| 368 |
+
)
|
| 369 |
+
return {
|
| 370 |
+
"ok": True,
|
| 371 |
+
"title": APP_TITLE,
|
| 372 |
+
"model_name": model_name,
|
| 373 |
+
"selected_model_id": selected_model_id,
|
| 374 |
+
"available_models": [{"id": item["id"], "label": item["label"]} for item in available_models],
|
| 375 |
+
"meta": active_generator.metadata(),
|
| 376 |
+
}
|
| 377 |
+
|
| 378 |
+
|
| 379 |
+
def create_app(generator: QuestionGenerator | None = None) -> Flask:
|
| 380 |
+
root = project_root()
|
| 381 |
+
frontend_root = root / "frontend"
|
| 382 |
+
|
| 383 |
+
app = Flask(__name__, static_folder=None)
|
| 384 |
+
app.json.ensure_ascii = False
|
| 385 |
+
app.config["GENERATOR"] = generator or build_generator()
|
| 386 |
+
app.config["PROJECT_ROOT"] = root
|
| 387 |
+
app.config["FRONTEND_ROOT"] = frontend_root
|
| 388 |
+
app.config["SELECTED_MODEL_ID"] = ""
|
| 389 |
+
|
| 390 |
+
@app.get("/")
|
| 391 |
+
def index():
|
| 392 |
+
return send_from_directory(app.config["FRONTEND_ROOT"], "index.html")
|
| 393 |
+
|
| 394 |
+
@app.get("/frontend/<path:filename>")
|
| 395 |
+
def frontend_file(filename: str):
|
| 396 |
+
return send_from_directory(app.config["FRONTEND_ROOT"], filename)
|
| 397 |
+
|
| 398 |
+
@app.get("/api/info")
|
| 399 |
+
def info():
|
| 400 |
+
return jsonify(_info_payload(app))
|
| 401 |
+
|
| 402 |
+
@app.post("/api/model")
|
| 403 |
+
def set_model():
|
| 404 |
+
payload = request.get_json(silent=True) or {}
|
| 405 |
+
model_id = str(payload.get("model_id") or "").strip()
|
| 406 |
+
if not model_id:
|
| 407 |
+
return jsonify({"ok": False, "error": "Vui lòng chọn model trước khi chuyển."}), 400
|
| 408 |
+
|
| 409 |
+
try:
|
| 410 |
+
active_generator = _switch_generator(app, model_id)
|
| 411 |
+
except ValueError as exc:
|
| 412 |
+
return jsonify({"ok": False, "error": str(exc)}), 404
|
| 413 |
+
|
| 414 |
+
return jsonify(_info_payload(app, active_generator))
|
| 415 |
+
|
| 416 |
+
@app.post("/api/generate")
|
| 417 |
+
def generate():
|
| 418 |
+
payload = request.get_json(silent=True) or {}
|
| 419 |
+
requested_model_id = str(payload.get("model_id") or "").strip()
|
| 420 |
+
|
| 421 |
+
if requested_model_id:
|
| 422 |
+
try:
|
| 423 |
+
active_generator = _switch_generator(app, requested_model_id)
|
| 424 |
+
except ValueError as exc:
|
| 425 |
+
return jsonify({"ok": False, "error": str(exc)}), 400
|
| 426 |
+
else:
|
| 427 |
+
active_generator = _generator(app)
|
| 428 |
+
|
| 429 |
+
text = normalize_text(payload.get("text"))
|
| 430 |
+
if not text:
|
| 431 |
+
return jsonify({"ok": False, "error": "Vui lòng nhập đoạn văn bản trước khi sinh câu hỏi."}), 400
|
| 432 |
+
|
| 433 |
+
raw_count = payload.get("num_questions")
|
| 434 |
+
if raw_count in (None, ""):
|
| 435 |
+
count = 5
|
| 436 |
+
else:
|
| 437 |
+
try:
|
| 438 |
+
count = int(raw_count)
|
| 439 |
+
except (TypeError, ValueError):
|
| 440 |
+
return jsonify({"ok": False, "error": "Số câu hỏi phải là số nguyên trong khoảng 1 đến 100."}), 400
|
| 441 |
+
|
| 442 |
+
if count < 1 or count > QUESTION_LIMIT:
|
| 443 |
+
return jsonify({"ok": False, "error": f"Số câu hỏi phải nằm trong khoảng 1 đến {QUESTION_LIMIT}."}), 400
|
| 444 |
+
|
| 445 |
+
started = time.perf_counter()
|
| 446 |
+
try:
|
| 447 |
+
questions = active_generator.generate(text, parse_question_count(count))
|
| 448 |
+
except Exception as exc: # noqa: BLE001
|
| 449 |
+
return jsonify({"ok": False, "error": str(exc)}), 500
|
| 450 |
+
|
| 451 |
+
elapsed_ms = round((time.perf_counter() - started) * 1000, 2)
|
| 452 |
+
info_payload = _info_payload(app, active_generator)
|
| 453 |
+
return jsonify(
|
| 454 |
+
{
|
| 455 |
+
"ok": True,
|
| 456 |
+
"text": text,
|
| 457 |
+
"num_questions": count,
|
| 458 |
+
"questions": questions,
|
| 459 |
+
"formatted": format_questions(questions),
|
| 460 |
+
"elapsed_ms": elapsed_ms,
|
| 461 |
+
"model_name": info_payload["model_name"],
|
| 462 |
+
"selected_model_id": info_payload["selected_model_id"],
|
| 463 |
+
"meta": active_generator.metadata(),
|
| 464 |
+
}
|
| 465 |
+
)
|
| 466 |
+
|
| 467 |
+
return app
|
| 468 |
+
|
| 469 |
+
|
| 470 |
+
def _generator(app: Flask) -> QuestionGenerator:
|
| 471 |
+
generator: QuestionGenerator = app.config["GENERATOR"]
|
| 472 |
+
return generator
|
| 473 |
+
"""
|
| 474 |
+
).strip()
|
| 475 |
+
+ "\n",
|
| 476 |
+
"generate_question.py": textwrap.dedent(
"""
from __future__ import annotations

import argparse
import json
import os
import re
import sys
import threading
from pathlib import Path
from typing import Any

os.environ.setdefault("TOKENIZERS_PARALLELISM", "false")
os.environ.setdefault("PYTORCH_CUDA_ALLOC_CONF", "expandable_segments:True")


def raise_missing_dependency_error(exc: ModuleNotFoundError) -> None:
    root = Path(__file__).resolve().parent
    requirements = root / "requirements.txt"
    message = [
        f"Thiếu thư viện Python: {exc.name}",
        f"Interpreter hiện tại: {sys.executable}",
    ]
    if requirements.exists():
        message.extend(
            [
                "Cài đặt dependencies bằng lệnh:",
                f"{sys.executable} -m pip install -r {requirements}",
            ]
        )
    raise SystemExit("\\n".join(message)) from exc


try:
    import torch
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
except ModuleNotFoundError as exc:
    raise_missing_dependency_error(exc)


APP_TITLE = "HVU_QA Tool - Sinh câu hỏi"
TASK_PREFIX = "sinh câu hỏi"
QUESTION_LIMIT = 100
GENERATION_PASSES = (
    (0.9, 0.95, 1, 4),
    (1.0, 0.97, 1, 5),
    (1.08, 0.99, 2, 6),
)


def normalize_text(text: Any) -> str:
    return " ".join(str(text or "").split())


def unique_text(items: list[str]) -> list[str]:
    seen: set[str] = set()
    output: list[str] = []
    for item in items:
        value = normalize_text(item)
        key = value.lower()
        if key and key not in seen:
            seen.add(key)
            output.append(value)
    return output


def parse_question_count(value: Any, default: int = 5) -> int:
    try:
        parsed = int(value)
    except (TypeError, ValueError):
        parsed = default
    return max(1, min(parsed, QUESTION_LIMIT))


def format_questions(items: list[str]) -> str:
    if not items:
        return "Không sinh được câu hỏi phù hợp."
    return "\\n".join(f"{index}. {item}" for index, item in enumerate(items, 1))


def resolve_model_dir(model_dir: str | Path, prefer_nested_model: bool = True) -> Path:
    model_root = Path(model_dir).expanduser().resolve()
    nested_candidates = [model_root / "best-model", model_root / "final-model"]
    candidates = [*nested_candidates, model_root] if prefer_nested_model else [model_root, *nested_candidates]
    for candidate in candidates:
        if candidate.is_dir() and (candidate / "config.json").exists():
            return candidate
    raise FileNotFoundError(f"Không tìm thấy thư mục mô hình hợp lệ: {model_root}")


def parse_dtype(value: str) -> torch.dtype:
    normalized = value.strip().lower()
    mapping = {
        "float16": torch.float16,
        "fp16": torch.float16,
        "float32": torch.float32,
        "fp32": torch.float32,
        "bfloat16": torch.bfloat16,
        "bf16": torch.bfloat16,
    }
    if normalized not in mapping:
        raise ValueError(f"Không hỗ trợ gpu_dtype={value}")
    return mapping[normalized]


class QuestionGenerator:
    def __init__(
        self,
        model_dir: str | Path = "t5-viet-qg-finetuned",
        task_prefix: str = TASK_PREFIX,
        max_source_length: int = 512,
        max_new_tokens: int = 64,
        device: str = "auto",
        cpu_threads: int | None = None,
        gpu_dtype: str = "auto",
        prefer_nested_model: bool = True,
    ) -> None:
        self.model_root = Path(model_dir).expanduser().resolve()
        self.model_dir = resolve_model_dir(model_dir, prefer_nested_model=prefer_nested_model)
        self.task_prefix = task_prefix
        self.max_source_length = max_source_length
        self.max_new_tokens = max_new_tokens
        self.requested_device = device
        self.cpu_threads = cpu_threads
        self.gpu_dtype = gpu_dtype
        self.device: torch.device | None = None
        self.dtype: torch.dtype | None = None
        self.tokenizer = None
        self.model = None
        self._load_lock = threading.Lock()

    def _resolve_device(self) -> torch.device:
        requested = self.requested_device.lower()
        if requested == "cpu":
            return torch.device("cpu")
        if requested == "cuda":
            if not torch.cuda.is_available():
                raise RuntimeError("Bạn đã chọn device=cuda nhưng máy hiện tại không có CUDA.")
            return torch.device("cuda")
        return torch.device("cuda" if torch.cuda.is_available() else "cpu")

    def _resolve_dtype(self) -> torch.dtype:
        if self.device is None or self.device.type != "cuda":
            return torch.float32
        if self.gpu_dtype == "auto":
            if hasattr(torch.cuda, "is_bf16_supported") and torch.cuda.is_bf16_supported():
                return torch.bfloat16
            return torch.float16
        return parse_dtype(self.gpu_dtype)

    def _configure_runtime(self) -> None:
        if self.device is None:
            return
        if self.device.type == "cpu":
            if self.cpu_threads:
                torch.set_num_threads(max(1, int(self.cpu_threads)))
                if hasattr(torch, "set_num_interop_threads"):
                    torch.set_num_interop_threads(max(1, min(int(self.cpu_threads), 4)))
            return

        if hasattr(torch.backends, "cuda") and hasattr(torch.backends.cuda, "matmul"):
            torch.backends.cuda.matmul.allow_tf32 = True
        if hasattr(torch.backends, "cudnn"):
            torch.backends.cudnn.allow_tf32 = True
            torch.backends.cudnn.benchmark = True

    def load(self) -> None:
        if self.model is not None and self.tokenizer is not None:
            return

        with self._load_lock:
            if self.model is not None and self.tokenizer is not None:
                return

            self.device = self._resolve_device()
            self.dtype = self._resolve_dtype()
            self._configure_runtime()

            model_kwargs: dict[str, Any] = {}
            if self.device.type == "cuda":
                model_kwargs["torch_dtype"] = self.dtype
                model_kwargs["low_cpu_mem_usage"] = True

            self.tokenizer = AutoTokenizer.from_pretrained(str(self.model_dir), use_fast=True)
            self.model = AutoModelForSeq2SeqLM.from_pretrained(str(self.model_dir), **model_kwargs)
            self.model.to(self.device)
            self.model.eval()

    def metadata(self) -> dict[str, Any]:
        active_device = self.device.type if self.device is not None else None
        predicted_device = "cuda" if torch.cuda.is_available() and self.requested_device != "cpu" else "cpu"
        return {
            "title": APP_TITLE,
            "model_root": str(self.model_root),
            "model_dir": str(self.model_dir),
            "requested_device": self.requested_device,
            "active_device": active_device,
            "predicted_device": predicted_device,
            "loaded": self.model is not None,
            "gpu_available": torch.cuda.is_available(),
            "gpu_dtype": None if self.dtype is None else str(self.dtype).replace("torch.", ""),
            "cpu_threads": torch.get_num_threads(),
        }

    def _candidate_answers(self, text: str, limit: int) -> list[str]:
        text = normalize_text(text)
        if not text:
            return []

        candidates: list[str] = []
        split_pattern = r"(?<=[.!?])\\s+|\\n+"
        for sentence in [normalize_text(part) for part in re.split(split_pattern, text) if normalize_text(part)]:
            if 3 <= len(sentence.split()) <= 30:
                candidates.append(sentence)
            for clause in (normalize_text(part) for part in re.split(r"\\s*[,;:]\\s*", sentence)):
                if 3 <= len(clause.split()) <= 20:
                    candidates.append(clause)

        if not candidates:
            words = text.split()
            candidates = [" ".join(words[: min(12, len(words))])] if words else [text]

        ranked = sorted(unique_text(candidates), key=lambda item: (abs(len(item.split()) - 10), len(item)))
        return ranked[:limit]

    def _build_prompt(self, context: str, answer: str) -> str:
        return f"{self.task_prefix}:\\nngữ cảnh: {context}\\nđáp án: {answer}"

    @torch.inference_mode()
    def _sample(self, context: str, answer: str, count: int, temperature: float, top_p: float) -> list[str]:
        if self.tokenizer is None or self.model is None or self.device is None:
            raise RuntimeError("Model chưa được load.")

        inputs = self.tokenizer(
            self._build_prompt(context, answer),
            return_tensors="pt",
            truncation=True,
            max_length=self.max_source_length,
        ).to(self.device)
        outputs = self.model.generate(
            **inputs,
            max_new_tokens=self.max_new_tokens,
            do_sample=True,
            temperature=temperature,
            top_p=top_p,
            num_return_sequences=max(1, min(count, 6)),
            no_repeat_ngram_size=3,
            repetition_penalty=1.1,
        )
        questions: list[str] = []
        for token_ids in outputs:
            question = normalize_text(self.tokenizer.decode(token_ids, skip_special_tokens=True))
            if question:
                questions.append(question if question.endswith("?") else f"{question}?")
        return [question for question in unique_text(questions) if len(question.split()) >= 3]

    def generate(self, text: str, num_questions: int = 5) -> list[str]:
        clean_text = normalize_text(text)
        requested_count = parse_question_count(num_questions)
        if not clean_text:
            return []

        self.load()
        answers = self._candidate_answers(clean_text, limit=max(requested_count * 3, 8))
        questions: list[str] = []

        for temperature, top_p, candidate_step, sample_count in GENERATION_PASSES:
            for index, answer in enumerate(answers):
                generated = self._sample(
                    clean_text,
                    answer,
                    count=min(sample_count + requested_count, requested_count + 2),
                    temperature=temperature,
                    top_p=top_p,
                )
                questions.extend(generated)
                questions = unique_text(questions)
                if len(questions) >= requested_count:
                    return questions[:requested_count]
                if candidate_step and (index + 1) % candidate_step == 0 and len(questions) >= requested_count:
                    return questions[:requested_count]

        return questions[:requested_count]


def _read_text_from_args(args: argparse.Namespace) -> str:
    if args.text:
        return normalize_text(args.text)
    if args.input_file:
        return normalize_text(Path(args.input_file).read_text(encoding="utf-8"))
    raise SystemExit("Vui lòng truyền --text hoặc --input_file.")


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(description="Sinh câu hỏi từ một đoạn văn bản bằng model T5 tiếng Việt.")
    parser.add_argument("--text", help="Đoạn văn bản đầu vào.")
    parser.add_argument("--input_file", help="Đọc đoạn văn bản từ file UTF-8.")
    parser.add_argument("--num_questions", type=int, default=5, help="Số câu hỏi cần sinh.")
    parser.add_argument("--model_dir", default=os.getenv("HVU_MODEL_DIR", "t5-viet-qg-finetuned"))
    parser.add_argument("--task_prefix", default=os.getenv("HVU_TASK_PREFIX", TASK_PREFIX))
    parser.add_argument("--device", default=os.getenv("HVU_DEVICE", "auto"), choices=["auto", "cpu", "cuda"])
    parser.add_argument("--cpu_threads", type=int, default=None)
    parser.add_argument("--gpu_dtype", default=os.getenv("HVU_GPU_DTYPE", "auto"))
    parser.add_argument("--max_source_length", type=int, default=int(os.getenv("HVU_MAX_SOURCE_LENGTH", "512")))
    parser.add_argument("--max_new_tokens", type=int, default=int(os.getenv("HVU_MAX_NEW_TOKENS", "64")))
    parser.add_argument("--output_format", choices=["text", "json"], default="text")
    return parser


def main() -> int:
    if hasattr(sys.stdout, "reconfigure"):
        sys.stdout.reconfigure(encoding="utf-8")
    if hasattr(sys.stderr, "reconfigure"):
        sys.stderr.reconfigure(encoding="utf-8")

    args = build_parser().parse_args()
    text = _read_text_from_args(args)
    generator = QuestionGenerator(
        model_dir=args.model_dir,
        task_prefix=args.task_prefix,
        max_source_length=args.max_source_length,
        max_new_tokens=args.max_new_tokens,
        device=args.device,
        cpu_threads=args.cpu_threads,
        gpu_dtype=args.gpu_dtype,
    )
    questions = generator.generate(text, args.num_questions)
    payload = {
        "ok": True,
        "text": text,
        "num_questions": parse_question_count(args.num_questions),
        "questions": questions,
        "formatted": format_questions(questions),
        "meta": generator.metadata(),
    }

    if args.output_format == "json":
        print(json.dumps(payload, ensure_ascii=False, indent=2))
    else:
        print(payload["formatted"])
    return 0


if __name__ == "__main__":
    raise SystemExit(main())
"""
).strip()
+ "\n",
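The two small helpers in `generate_question.py` above are fully deterministic, so their contracts are easy to check in isolation. Redefined standalone here for illustration (same logic as the module above):

```python
QUESTION_LIMIT = 100

def parse_question_count(value, default=5):
    # Same clamping logic as generate_question.py: coerce to int,
    # fall back to the default on bad input, clamp into [1, 100].
    try:
        parsed = int(value)
    except (TypeError, ValueError):
        parsed = default
    return max(1, min(parsed, QUESTION_LIMIT))

def format_questions(items):
    # Same numbered-list formatting as generate_question.py.
    if not items:
        return "Không sinh được câu hỏi phù hợp."
    return "\n".join(f"{index}. {item}" for index, item in enumerate(items, 1))

print(parse_question_count("250"))              # 100 (clamped to QUESTION_LIMIT)
print(parse_question_count(None))               # 5 (unparseable -> default)
print(format_questions(["Ai ban hành luật?"]))  # 1. Ai ban hành luật?
```

This is why any `num_questions` value the CLI or API receives ends up between 1 and 100 before generation starts.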
"frontend/index.html": textwrap.dedent(
"""
<!doctype html>
<html lang="vi">
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <title>HVU_QA Tool</title>
  <link rel="stylesheet" href="/frontend/style.css">
</head>
<body>
  <div class="page-shell">
    <header class="hero">
      <span class="hero-badge">HVU_QA Tool</span>
      <h1>Sinh câu hỏi từ văn bản</h1>
      <p>Launcher nhẹ dành cho người dùng cuối. Chỉ cần một file tool để dựng runtime, tải model và chạy ứng dụng.</p>
    </header>

    <div class="layout">
      <aside class="sidebar">
        <section class="panel">
          <div class="panel-heading">
            <h2>Trạng thái model</h2>
            <span id="readyBadge" class="badge badge-soft">Đang tải</span>
          </div>

          <label class="field-label" for="modelSelect">Model đang dùng</label>
          <select id="modelSelect" class="select-field"></select>

          <dl class="status-list">
            <div>
              <dt>Tên hiển thị</dt>
              <dd id="modelName">-</dd>
            </div>
            <div>
              <dt>Thiết bị</dt>
              <dd id="deviceStatus">-</dd>
            </div>
            <div>
              <dt>Trạng thái nạp</dt>
              <dd id="loadedStatus">-</dd>
            </div>
          </dl>
        </section>

        <section class="panel">
          <div class="panel-heading">
            <h2>Ví dụ mẫu</h2>
          </div>
          <p class="panel-hint">Bấm vào một văn bản luật mẫu để chèn nhanh nội dung thử nghiệm.</p>
          <div id="sampleList" class="sample-list"></div>
        </section>
      </aside>

      <main class="main-panel">
        <section class="composer panel">
          <label class="field-label" for="sourceText">Đoạn văn bản đầu vào</label>
          <textarea id="sourceText" class="text-input" placeholder="Nhập đoạn văn bản ..."></textarea>

          <div class="composer-footer">
            <div class="count-field">
              <span class="field-label">Số câu hỏi</span>
              <div class="count-controls">
                <button id="decreaseCount" type="button" class="count-button">-</button>
                <input id="questionCount" class="count-input" type="number" min="1" max="100" value="5">
                <button id="increaseCount" type="button" class="count-button">+</button>
              </div>
            </div>

            <button id="generateButton" type="button" class="primary-button">
              <span id="generateButtonText">Sinh câu hỏi</span>
            </button>
          </div>

          <p id="formMessage" class="form-message"></p>
        </section>

        <section id="resultPanel" class="result-panel panel">
          <div id="resultPlaceholder" class="result-placeholder">
            Nhập văn bản và nhấn <strong>Sinh câu hỏi</strong> để xem kết quả.
          </div>

          <div id="resultContent" class="result-content hidden">
            <div class="result-header">
              <div>
                <h2>Kết quả sinh câu hỏi</h2>
                <p id="resultStats" class="result-stats"></p>
              </div>
              <button id="copyButton" type="button" class="secondary-button">Sao chép</button>
            </div>

            <ol id="resultList" class="result-list"></ol>
            <pre id="formattedOutput" class="formatted-output"></pre>
          </div>
        </section>
      </main>
    </div>
  </div>

  <script src="/frontend/app.js"></script>
</body>
</html>
"""
).strip()
+ "\n",
"frontend/app.js": textwrap.dedent(
"""
const sampleTexts = [
  {
    title: 'Luật Giáo dục đại học',
    text: 'Cơ sở giáo dục đại học có nhiệm vụ tổ chức đào tạo, nghiên cứu khoa học, chuyển giao công nghệ và phục vụ cộng đồng theo quy định của pháp luật.'
  },
  {
    title: 'Bộ luật Lao động',
    text: 'Người lao động là người làm việc cho người sử dụng lao động theo thỏa thuận, được trả lương và chịu sự quản lý, điều hành, giám sát của người sử dụng lao động.'
  },
  {
    title: 'Luật An toàn thông tin mạng',
    text: 'An toàn thông tin mạng là sự bảo vệ thông tin, hệ thống thông tin trên mạng khỏi bị truy nhập, sử dụng, tiết lộ, gián đoạn, sửa đổi hoặc phá hoại trái phép.'
  }
];

const state = {
  info: null,
  loading: false,
  count: 5,
  lastFormatted: ''
};

const elements = {
  modelSelect: document.getElementById('modelSelect'),
  readyBadge: document.getElementById('readyBadge'),
  modelName: document.getElementById('modelName'),
  deviceStatus: document.getElementById('deviceStatus'),
  loadedStatus: document.getElementById('loadedStatus'),
  sampleList: document.getElementById('sampleList'),
  sourceText: document.getElementById('sourceText'),
  decreaseCount: document.getElementById('decreaseCount'),
  increaseCount: document.getElementById('increaseCount'),
  questionCount: document.getElementById('questionCount'),
  generateButton: document.getElementById('generateButton'),
  generateButtonText: document.getElementById('generateButtonText'),
  formMessage: document.getElementById('formMessage'),
  resultPanel: document.getElementById('resultPanel'),
  resultPlaceholder: document.getElementById('resultPlaceholder'),
  resultContent: document.getElementById('resultContent'),
  resultStats: document.getElementById('resultStats'),
  resultList: document.getElementById('resultList'),
  formattedOutput: document.getElementById('formattedOutput'),
  copyButton: document.getElementById('copyButton')
};

function normalizeCount(value) {
  const parsed = Number.parseInt(value, 10);
  if (Number.isNaN(parsed)) {
    return 1;
  }
  return Math.max(1, Math.min(100, parsed));
}

function setCount(value) {
  state.count = normalizeCount(value);
  elements.questionCount.value = String(state.count);
}

function setMessage(text, tone = 'muted') {
  elements.formMessage.textContent = text || '';
  elements.formMessage.dataset.tone = tone;
}

function setLoading(loading) {
  state.loading = loading;
  elements.generateButton.disabled = loading;
  elements.modelSelect.disabled = loading;
  elements.generateButtonText.textContent = loading ? 'Đang xử lý...' : 'Sinh câu hỏi';
  elements.readyBadge.textContent = loading ? 'Đang chạy' : 'Sẵn sàng';
  elements.readyBadge.classList.toggle('badge-busy', loading);
}

async function fetchJson(url, options = {}) {
  const response = await fetch(url, options);
  const payload = await response.json().catch(() => ({}));
  if (!response.ok || payload.ok === false) {
    throw new Error(payload.error || `Yêu cầu thất bại (${response.status})`);
  }
  return payload;
}

function renderSamples() {
  elements.sampleList.innerHTML = '';
  sampleTexts.forEach((sample) => {
    const button = document.createElement('button');
    button.type = 'button';
    button.className = 'sample-card';
    button.innerHTML = `<strong>${sample.title}</strong><span>${sample.text}</span>`;
    button.addEventListener('click', () => {
      elements.sourceText.value = sample.text;
      setMessage(`Đã chèn mẫu: ${sample.title}`, 'muted');
      elements.sourceText.focus();
    });
    elements.sampleList.appendChild(button);
  });
}

function renderInfo(info) {
  state.info = info;
  const models = Array.isArray(info.available_models) ? info.available_models : [];
  const selectedId = info.selected_model_id || models[0]?.id || '';

  elements.modelSelect.innerHTML = '';
  if (!models.length) {
    const option = document.createElement('option');
    option.value = '';
    option.textContent = 'Không có model khả dụng';
    elements.modelSelect.appendChild(option);
  } else {
    models.forEach((model) => {
      const option = document.createElement('option');
      option.value = model.id;
      option.textContent = model.label;
      elements.modelSelect.appendChild(option);
    });
    elements.modelSelect.value = selectedId;
  }

  const meta = info.meta || {};
  elements.modelName.textContent = info.model_name || '-';
  elements.deviceStatus.textContent = meta.active_device
    ? meta.active_device.toUpperCase()
    : (meta.predicted_device ? `Dự đoán: ${String(meta.predicted_device).toUpperCase()}` : '-');
  elements.loadedStatus.textContent = meta.loaded ? 'Đã nạp' : 'Chưa nạp';
  elements.readyBadge.textContent = 'Sẵn sàng';
  elements.readyBadge.classList.remove('badge-busy');
}

function renderResult(result) {
  const questions = Array.isArray(result.questions) ? result.questions : [];
  elements.resultPlaceholder.classList.add('hidden');
  elements.resultContent.classList.remove('hidden');
  elements.resultList.innerHTML = '';

  questions.forEach((question) => {
    const item = document.createElement('li');
    item.textContent = question;
    elements.resultList.appendChild(item);
  });

  state.lastFormatted = result.formatted || '';
  elements.formattedOutput.textContent = state.lastFormatted;
  elements.resultStats.textContent = `${questions.length} câu hỏi • ${result.model_name || 'Không rõ model'} • ${result.elapsed_ms || 0} ms`;
}

async function loadInfo() {
  const info = await fetchJson('/api/info');
  renderInfo(info);
  setMessage('Sẵn sàng để sinh câu hỏi.', 'muted');
}

async function changeModel() {
  const modelId = elements.modelSelect.value;
  if (!modelId) {
    return;
  }
  setLoading(true);
  setMessage('Đang chuyển model...', 'muted');
  try {
    const info = await fetchJson('/api/model', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ model_id: modelId })
    });
    renderInfo(info);
    setMessage(`Đã chuyển sang model: ${info.model_name}`, 'muted');
  } catch (error) {
    setMessage(error.message, 'error');
  } finally {
    setLoading(false);
  }
}

async function generateQuestions() {
  const text = elements.sourceText.value.trim();
  if (!text) {
    setMessage('Vui lòng nhập đoạn văn bản trước khi sinh câu hỏi.', 'error');
    elements.sourceText.focus();
    return;
  }

  setLoading(true);
  setMessage('Đang sinh câu hỏi từ nội dung đã nhập...', 'muted');

  try {
    const payload = await fetchJson('/api/generate', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        text,
        num_questions: state.count,
        model_id: elements.modelSelect.value || undefined
      })
    });
    renderResult(payload);
    setMessage(`Đã sinh xong ${payload.questions.length} câu hỏi.`, 'muted');
  } catch (error) {
    setMessage(error.message, 'error');
  } finally {
    setLoading(false);
  }
}

async function copyOutput() {
  if (!state.lastFormatted) {
    setMessage('Chưa có nội dung để sao chép.', 'error');
    return;
  }

  try {
    await navigator.clipboard.writeText(state.lastFormatted);
    setMessage('Đã sao chép kết quả vào clipboard.', 'muted');
  } catch (error) {
    setMessage('Không thể sao chép tự động. Hãy sao chép thủ công.', 'error');
  }
}

function bindEvents() {
  elements.decreaseCount.addEventListener('click', () => setCount(state.count - 1));
  elements.increaseCount.addEventListener('click', () => setCount(state.count + 1));
  elements.questionCount.addEventListener('change', (event) => setCount(event.target.value));
  elements.modelSelect.addEventListener('change', changeModel);
  elements.generateButton.addEventListener('click', generateQuestions);
  elements.copyButton.addEventListener('click', copyOutput);
}

async function init() {
  renderSamples();
  setCount(5);
  bindEvents();
  try {
    await loadInfo();
  } catch (error) {
    setMessage(error.message || 'Không thể kết nối backend.', 'error');
    elements.readyBadge.textContent = 'Lỗi';
  }
}

document.addEventListener('DOMContentLoaded', init);
"""
).strip()
+ "\n",
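The client-side clamp in `app.js` mirrors the Python `parse_question_count`, with one asymmetry worth noting: on unparseable input the server falls back to 5, while the client falls back to 1. A standalone copy of the function (same code as in `app.js`) for quick experimentation in Node:

```javascript
// Same clamp as app.js: parse, fall back to 1 on NaN, clamp into [1, 100].
function normalizeCount(value) {
  const parsed = Number.parseInt(value, 10);
  if (Number.isNaN(parsed)) {
    return 1; // client-side fallback (the Python helper falls back to 5)
  }
  return Math.max(1, Math.min(100, parsed));
}

console.log(normalizeCount('250')); // 100
console.log(normalizeCount('abc')); // 1
console.log(normalizeCount(7));     // 7
```

Because both ends clamp independently, a hand-crafted request with an out-of-range `num_questions` is still bounded by the server.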
"frontend/style.css": textwrap.dedent(
"""
:root {
  --bg-start: #f8f5ff;
  --bg-end: #eef4ff;
  --panel: rgba(255, 255, 255, 0.82);
  --border: rgba(103, 102, 181, 0.18);
  --text: #23244d;
  --muted: #6c6d9a;
  --primary-start: #6b73ff;
  --primary-end: #d96ba2;
  --shadow: 0 22px 60px rgba(52, 56, 121, 0.14);
}

* {
  box-sizing: border-box;
}

body {
  margin: 0;
  min-height: 100vh;
  font-family: "Be Vietnam Pro", "Segoe UI", sans-serif;
  color: var(--text);
  background:
    radial-gradient(circle at top left, rgba(123, 135, 255, 0.14), transparent 28%),
    radial-gradient(circle at bottom right, rgba(217, 107, 162, 0.18), transparent 25%),
    linear-gradient(135deg, var(--bg-start), var(--bg-end));
}

button,
input,
textarea,
select {
  font: inherit;
}

.page-shell {
  width: min(1200px, calc(100% - 32px));
  margin: 24px auto;
}

.hero {
  padding: 32px;
  border: 1px solid var(--border);
  border-radius: 28px;
  background: var(--panel);
  box-shadow: var(--shadow);
  backdrop-filter: blur(18px);
}

.hero-badge {
  display: inline-flex;
  padding: 8px 14px;
  border-radius: 999px;
  background: rgba(107, 115, 255, 0.12);
  color: #5058d9;
  font-size: 13px;
  font-weight: 700;
  letter-spacing: 0.04em;
  text-transform: uppercase;
}

.hero h1 {
  margin: 18px 0 10px;
  font-size: clamp(34px, 5vw, 56px);
  line-height: 1.04;
}

.hero p {
  margin: 0;
  max-width: 760px;
  color: var(--muted);
  font-size: 18px;
  line-height: 1.65;
}

.layout {
  display: grid;
  grid-template-columns: 320px minmax(0, 1fr);
  gap: 20px;
  margin-top: 20px;
}

.panel {
  border: 1px solid var(--border);
  border-radius: 24px;
  background: var(--panel);
  box-shadow: var(--shadow);
  backdrop-filter: blur(18px);
}

.sidebar,
.main-panel {
  display: grid;
  gap: 20px;
  align-content: start;
}

.panel-heading {
  display: flex;
  align-items: center;
  justify-content: space-between;
  gap: 12px;
  margin-bottom: 16px;
}

.panel h2 {
  margin: 0;
  font-size: 18px;
}

.sidebar .panel,
.composer,
.result-panel {
  padding: 22px;
}

.badge {
  display: inline-flex;
  align-items: center;
  justify-content: center;
  min-width: 92px;
  padding: 8px 12px;
  border-radius: 999px;
  font-size: 13px;
  font-weight: 700;
}

.badge-soft {
  background: rgba(39, 179, 112, 0.14);
  color: #218b59;
}

.badge-busy {
  background: rgba(238, 160, 59, 0.16);
  color: #b86a00;
}

.field-label {
  display: inline-block;
  margin-bottom: 10px;
  color: var(--muted);
  font-size: 13px;
  font-weight: 700;
  letter-spacing: 0.02em;
}

.select-field,
.text-input,
.count-input {
  width: 100%;
  border: 1px solid rgba(103, 102, 181, 0.14);
  border-radius: 18px;
  background: rgba(255, 255, 255, 0.92);
  color: var(--text);
}

.select-field {
  min-height: 52px;
  padding: 0 16px;
}

.status-list {
  display: grid;
  gap: 14px;
  margin: 18px 0 0;
}

.status-list div {
  padding: 14px 16px;
  border-radius: 18px;
  background: rgba(104, 109, 208, 0.07);
}

.status-list dt {
margin: 0 0 6px;
|
| 1350 |
+
color: var(--muted);
|
| 1351 |
+
font-size: 12px;
|
| 1352 |
+
font-weight: 700;
|
| 1353 |
+
text-transform: uppercase;
|
| 1354 |
+
letter-spacing: 0.04em;
|
| 1355 |
+
}
|
| 1356 |
+
|
| 1357 |
+
.status-list dd {
|
| 1358 |
+
margin: 0;
|
| 1359 |
+
font-size: 15px;
|
| 1360 |
+
font-weight: 600;
|
| 1361 |
+
word-break: break-word;
|
| 1362 |
+
}
|
| 1363 |
+
|
| 1364 |
+
.panel-hint {
|
| 1365 |
+
margin: 0 0 14px;
|
| 1366 |
+
color: var(--muted);
|
| 1367 |
+
line-height: 1.6;
|
| 1368 |
+
}
|
| 1369 |
+
|
| 1370 |
+
.sample-list {
|
| 1371 |
+
display: grid;
|
| 1372 |
+
gap: 12px;
|
| 1373 |
+
}
|
| 1374 |
+
|
| 1375 |
+
.sample-card {
|
| 1376 |
+
display: grid;
|
| 1377 |
+
gap: 8px;
|
| 1378 |
+
width: 100%;
|
| 1379 |
+
padding: 16px;
|
| 1380 |
+
border: 1px solid rgba(103, 102, 181, 0.14);
|
| 1381 |
+
border-radius: 18px;
|
| 1382 |
+
background: rgba(255, 255, 255, 0.92);
|
| 1383 |
+
text-align: left;
|
| 1384 |
+
color: var(--text);
|
| 1385 |
+
cursor: pointer;
|
| 1386 |
+
transition: transform 0.18s ease, border-color 0.18s ease, box-shadow 0.18s ease;
|
| 1387 |
+
}
|
| 1388 |
+
|
| 1389 |
+
.sample-card:hover {
|
| 1390 |
+
transform: translateY(-2px);
|
| 1391 |
+
border-color: rgba(86, 98, 218, 0.32);
|
| 1392 |
+
box-shadow: 0 16px 30px rgba(61, 70, 154, 0.12);
|
| 1393 |
+
}
|
| 1394 |
+
|
| 1395 |
+
.sample-card span {
|
| 1396 |
+
color: var(--muted);
|
| 1397 |
+
line-height: 1.55;
|
| 1398 |
+
}
|
| 1399 |
+
|
| 1400 |
+
.text-input {
|
| 1401 |
+
min-height: 250px;
|
| 1402 |
+
padding: 18px 20px;
|
| 1403 |
+
resize: vertical;
|
| 1404 |
+
line-height: 1.7;
|
| 1405 |
+
}
|
| 1406 |
+
|
| 1407 |
+
.composer-footer {
|
| 1408 |
+
display: flex;
|
| 1409 |
+
align-items: end;
|
| 1410 |
+
justify-content: space-between;
|
| 1411 |
+
gap: 18px;
|
| 1412 |
+
margin-top: 18px;
|
| 1413 |
+
}
|
| 1414 |
+
|
| 1415 |
+
.count-field {
|
| 1416 |
+
min-width: 230px;
|
| 1417 |
+
}
|
| 1418 |
+
|
| 1419 |
+
.count-controls {
|
| 1420 |
+
display: grid;
|
| 1421 |
+
grid-template-columns: 48px 92px 48px;
|
| 1422 |
+
gap: 10px;
|
| 1423 |
+
align-items: center;
|
| 1424 |
+
}
|
| 1425 |
+
|
| 1426 |
+
.count-button,
|
| 1427 |
+
.secondary-button {
|
| 1428 |
+
min-height: 48px;
|
| 1429 |
+
border: 1px solid rgba(103, 102, 181, 0.16);
|
| 1430 |
+
border-radius: 16px;
|
| 1431 |
+
background: rgba(255, 255, 255, 0.92);
|
| 1432 |
+
color: var(--text);
|
| 1433 |
+
cursor: pointer;
|
| 1434 |
+
}
|
| 1435 |
+
|
| 1436 |
+
.count-button {
|
| 1437 |
+
font-size: 22px;
|
| 1438 |
+
font-weight: 700;
|
| 1439 |
+
}
|
| 1440 |
+
|
| 1441 |
+
.count-input {
|
| 1442 |
+
min-height: 48px;
|
| 1443 |
+
padding: 0 12px;
|
| 1444 |
+
text-align: center;
|
| 1445 |
+
font-weight: 700;
|
| 1446 |
+
}
|
| 1447 |
+
|
| 1448 |
+
.primary-button {
|
| 1449 |
+
min-width: 220px;
|
| 1450 |
+
min-height: 56px;
|
| 1451 |
+
padding: 0 24px;
|
| 1452 |
+
border: none;
|
| 1453 |
+
border-radius: 18px;
|
| 1454 |
+
background: linear-gradient(135deg, var(--primary-start), var(--primary-end));
|
| 1455 |
+
color: white;
|
| 1456 |
+
font-size: 16px;
|
| 1457 |
+
font-weight: 800;
|
| 1458 |
+
cursor: pointer;
|
| 1459 |
+
box-shadow: 0 18px 34px rgba(95, 105, 220, 0.24);
|
| 1460 |
+
}
|
| 1461 |
+
|
| 1462 |
+
.primary-button:disabled,
|
| 1463 |
+
.secondary-button:disabled {
|
| 1464 |
+
cursor: not-allowed;
|
| 1465 |
+
opacity: 0.7;
|
| 1466 |
+
}
|
| 1467 |
+
|
| 1468 |
+
.form-message {
|
| 1469 |
+
min-height: 22px;
|
| 1470 |
+
margin: 14px 0 0;
|
| 1471 |
+
color: var(--muted);
|
| 1472 |
+
}
|
| 1473 |
+
|
| 1474 |
+
.form-message[data-tone="error"] {
|
| 1475 |
+
color: #c33b5f;
|
| 1476 |
+
}
|
| 1477 |
+
|
| 1478 |
+
.result-panel {
|
| 1479 |
+
min-height: 320px;
|
| 1480 |
+
}
|
| 1481 |
+
|
| 1482 |
+
.result-placeholder {
|
| 1483 |
+
display: grid;
|
| 1484 |
+
place-items: center;
|
| 1485 |
+
min-height: 260px;
|
| 1486 |
+
padding: 24px;
|
| 1487 |
+
border: 1px dashed rgba(103, 102, 181, 0.24);
|
| 1488 |
+
border-radius: 20px;
|
| 1489 |
+
color: var(--muted);
|
| 1490 |
+
text-align: center;
|
| 1491 |
+
line-height: 1.7;
|
| 1492 |
+
}
|
| 1493 |
+
|
| 1494 |
+
.result-content.hidden,
|
| 1495 |
+
.result-placeholder.hidden {
|
| 1496 |
+
display: none;
|
| 1497 |
+
}
|
| 1498 |
+
|
| 1499 |
+
.result-header {
|
| 1500 |
+
display: flex;
|
| 1501 |
+
align-items: start;
|
| 1502 |
+
justify-content: space-between;
|
| 1503 |
+
gap: 16px;
|
| 1504 |
+
margin-bottom: 18px;
|
| 1505 |
+
}
|
| 1506 |
+
|
| 1507 |
+
.result-header h2 {
|
| 1508 |
+
margin: 0 0 8px;
|
| 1509 |
+
}
|
| 1510 |
+
|
| 1511 |
+
.result-stats {
|
| 1512 |
+
margin: 0;
|
| 1513 |
+
color: var(--muted);
|
| 1514 |
+
}
|
| 1515 |
+
|
| 1516 |
+
.result-list {
|
| 1517 |
+
margin: 0;
|
| 1518 |
+
padding-left: 20px;
|
| 1519 |
+
display: grid;
|
| 1520 |
+
gap: 12px;
|
| 1521 |
+
line-height: 1.65;
|
| 1522 |
+
}
|
| 1523 |
+
|
| 1524 |
+
.formatted-output {
|
| 1525 |
+
margin: 20px 0 0;
|
| 1526 |
+
padding: 18px;
|
| 1527 |
+
border-radius: 18px;
|
| 1528 |
+
background: rgba(104, 109, 208, 0.07);
|
| 1529 |
+
white-space: pre-wrap;
|
| 1530 |
+
word-break: break-word;
|
| 1531 |
+
line-height: 1.65;
|
| 1532 |
+
}
|
| 1533 |
+
|
| 1534 |
+
@media (max-width: 980px) {
|
| 1535 |
+
.layout {
|
| 1536 |
+
grid-template-columns: 1fr;
|
| 1537 |
+
}
|
| 1538 |
+
}
|
| 1539 |
+
|
| 1540 |
+
@media (max-width: 640px) {
|
| 1541 |
+
.page-shell {
|
| 1542 |
+
width: min(100% - 16px, 1000px);
|
| 1543 |
+
margin: 16px auto;
|
| 1544 |
+
}
|
| 1545 |
+
|
| 1546 |
+
.hero,
|
| 1547 |
+
.sidebar .panel,
|
| 1548 |
+
.composer,
|
| 1549 |
+
.result-panel {
|
| 1550 |
+
padding: 18px;
|
| 1551 |
+
}
|
| 1552 |
+
|
| 1553 |
+
.composer-footer,
|
| 1554 |
+
.result-header {
|
| 1555 |
+
flex-direction: column;
|
| 1556 |
+
align-items: stretch;
|
| 1557 |
+
}
|
| 1558 |
+
|
| 1559 |
+
.count-field,
|
| 1560 |
+
.primary-button,
|
| 1561 |
+
.secondary-button {
|
| 1562 |
+
width: 100%;
|
| 1563 |
+
}
|
| 1564 |
+
}
|
| 1565 |
+
"""
|
| 1566 |
+
).strip()
|
| 1567 |
+
+ "\n",
|
| 1568 |
+
}
|
| 1569 |
+

def sync_text_file(destination_file: Path, content: str, force_write: bool) -> bool:
    destination_file.parent.mkdir(parents=True, exist_ok=True)
    if destination_file.exists() and not force_write:
        current = destination_file.read_text(encoding="utf-8")
        if current == content:
            return False
    destination_file.write_text(content, encoding="utf-8")
    return True


def materialize_standalone_runtime(runtime_root: Path, force_refresh: bool) -> None:
    runtime_files = build_runtime_file_map()
    created = 0
    reused = 0

    for relative_path, content in runtime_files.items():
        destination = runtime_root / relative_path
        if sync_text_file(destination, content, force_write=force_refresh):
            created += 1
        else:
            reused += 1

    print_step(
        f"Đã chuẩn bị runtime standalone tại {runtime_root}. "
        f"File mới/cập nhật: {created}, file giữ nguyên: {reused}."
    )


def resolve_runtime_context(args: argparse.Namespace) -> RuntimeContext:
    use_local_project = has_local_project(SCRIPT_ROOT) and not args.force_standalone_runtime
    if use_local_project:
        runtime_root = SCRIPT_ROOT
        standalone_mode = False
    else:
        requested_runtime_dir = Path(args.runtime_dir).expanduser()
        if not requested_runtime_dir.is_absolute():
            requested_runtime_dir = SCRIPT_ROOT / requested_runtime_dir
        runtime_root = requested_runtime_dir.resolve()
        standalone_mode = True
        materialize_standalone_runtime(runtime_root, force_refresh=args.force_runtime_refresh)

    context = RuntimeContext(
        root=runtime_root,
        main_file=runtime_root / "main.py",
        requirements_file=runtime_root / "requirements.txt",
        local_model_dir=runtime_root / "t5-viet-qg-finetuned",
        local_best_model_dir=runtime_root / "t5-viet-qg-finetuned" / "best-model",
        standalone_mode=standalone_mode,
    )
    mode_label = "standalone" if standalone_mode else "full project"
    print_step(f"Runtime mode: {mode_label}")
    print_step(f"Runtime root: {context.root}")
    return context


def maybe_bootstrap_tool_venv(args: argparse.Namespace) -> int | None:
    if args.no_venv or is_running_in_virtualenv():
        return None

    if not TOOL_VENV_PYTHON.exists():
        print_step("Không phát hiện virtualenv hiện tại. Đang tạo môi trường riêng cho launcher...")
        run_command([sys.executable, "-m", "venv", str(TOOL_VENV_DIR)], cwd=SCRIPT_ROOT)
        run_command([str(TOOL_VENV_PYTHON), "-m", "pip", "install", "--upgrade", "pip"], cwd=SCRIPT_ROOT)

    relaunch_env = os.environ.copy()
    relaunch_env["HVU_QA_TOOL_BOOTSTRAPPED"] = "1"
    relaunch_command = [str(TOOL_VENV_PYTHON), str(Path(__file__).resolve()), *sys.argv[1:]]

    print_step("Đang chuyển sang môi trường Python riêng của launcher...")
    return subprocess.call(relaunch_command, cwd=str(SCRIPT_ROOT), env=relaunch_env)


def ensure_huggingface_hub(skip_install: bool, context: RuntimeContext) -> None:
    if module_exists("huggingface_hub"):
        return

    if skip_install:
        install_hint = (
            f"{sys.executable} -m pip install {HF_HUB_REQUIREMENT}"
            if not context.requirements_file.exists()
            else f"{sys.executable} -m pip install -r {context.requirements_file}"
        )
        raise RuntimeError(
            "Thiếu huggingface_hub. Hãy chạy "
            f"`{install_hint}` hoặc bỏ `--skip-install`."
        )

    print_step("Thiếu huggingface_hub. Đang cài tự động...")
    if context.requirements_file.exists():
        run_command([sys.executable, "-m", "pip", "install", "-r", str(context.requirements_file)], cwd=context.root)
    else:
        run_command([sys.executable, "-m", "pip", "install", HF_HUB_REQUIREMENT], cwd=context.root)


def find_missing_dependencies() -> list[str]:
    missing: list[str] = []
    for package_name, module_name in DEPENDENCY_IMPORTS.items():
        if not module_exists(module_name):
            missing.append(package_name)
    return missing


def ensure_runtime_dependencies(skip_install: bool, context: RuntimeContext) -> None:
    missing = find_missing_dependencies()
    if not missing:
        print_step("Môi trường Python đã có đủ dependency cần thiết.")
        return

    if skip_install:
        missing_text = ", ".join(missing)
        install_hint = (
            f"{sys.executable} -m pip install -r {context.requirements_file}"
            if context.requirements_file.exists()
            else f"{sys.executable} -m pip install {' '.join(RUNTIME_REQUIREMENTS)}"
        )
        raise RuntimeError(
            f"Thiếu dependency: {missing_text}. "
            f"Hãy chạy `{install_hint}` hoặc bỏ `--skip-install`."
        )

    if context.requirements_file.exists():
        print_step(f"Đang cài dependency còn thiếu: {', '.join(missing)}")
        run_command([sys.executable, "-m", "pip", "install", "-r", str(context.requirements_file)], cwd=context.root)
        return

    print_step(f"Đang cài dependency runtime còn thiếu: {', '.join(missing)}")
    run_command([sys.executable, "-m", "pip", "install", *RUNTIME_REQUIREMENTS], cwd=context.root)


def select_repo_files(repo_files: list[str], best_model_only: bool) -> list[str]:
    allow_patterns = build_allow_patterns(best_model_only)
    selected: list[str] = []

    for repo_file in repo_files:
        normalized = repo_file.replace("\\", "/")
        if not matches_any_pattern(normalized, allow_patterns):
            continue
        if matches_any_pattern(normalized, MODEL_IGNORE_PATTERNS):
            continue
        selected.append(normalized)

    return sorted(selected)
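`select_repo_files` applies an allow-then-ignore filter over the normalized repo paths. The helpers `build_allow_patterns` and `matches_any_pattern` are defined elsewhere in the launcher; a minimal self-contained sketch of the same filtering with stdlib `fnmatch` (the pattern lists below are illustrative assumptions, not the launcher's real ones) might look like:

```python
from fnmatch import fnmatch

# Illustrative patterns only -- the launcher derives its own allow/ignore lists.
ALLOW = ["t5-viet-qg-finetuned/*", "t5-viet-qg-finetuned/best-model/*"]
IGNORE = ["*/.gitattributes", "*/optimizer.pt"]

def pick(paths: list[str]) -> list[str]:
    # Normalize separators, keep allow-listed files, then drop ignored ones.
    kept = []
    for p in (path.replace("\\", "/") for path in paths):
        if not any(fnmatch(p, pat) for pat in ALLOW):
            continue
        if any(fnmatch(p, pat) for pat in IGNORE):
            continue
        kept.append(p)
    return sorted(kept)
```

The two-stage order matters: a file must match at least one allow pattern, and an ignore match then wins.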

def get_target_destination(context: RuntimeContext, repo_file: str) -> Path:
    relative_path = Path(repo_file).relative_to(HF_MODEL_SUBDIR)
    return context.local_model_dir / relative_path


def resolve_repo_files(repo_id: str, revision: str, best_model_only: bool) -> list[dict[str, int | str | None]]:
    from huggingface_hub import HfApi

    api = HfApi()
    repo_files = api.list_repo_tree(repo_id=repo_id, repo_type="dataset", revision=revision, recursive=True)

    file_entries: list[str] = []
    size_map: dict[str, int | None] = {}
    for entry in repo_files:
        entry_path = str(getattr(entry, "path", "")).replace("\\", "/")
        if not entry_path or entry_path.endswith("/"):
            continue
        file_entries.append(entry_path)
        size_map[entry_path] = getattr(entry, "size", None)

    selected_paths = select_repo_files(file_entries, best_model_only=best_model_only)
    if not selected_paths:
        scope = "best-model" if best_model_only else "model"
        raise FileNotFoundError(
            f"Không tìm thấy file {scope} hợp lệ trong repo {repo_id}@{revision}. "
            "Hãy kiểm tra lại cấu trúc repo trên Hugging Face."
        )

    return [{"path": path, "size": size_map.get(path)} for path in selected_paths]
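`get_target_destination` re-roots a repo-side path under the local model directory by stripping the `HF_MODEL_SUBDIR` prefix with `Path.relative_to`. The same remapping in isolation (the subdir and root names here are assumptions for illustration):

```python
from pathlib import Path

def remap(repo_file: str, subdir: str, local_root: Path) -> Path:
    # Strip the repo-side prefix, then re-root under the local model dir.
    return local_root / Path(repo_file).relative_to(subdir)
```

Note that `relative_to` raises `ValueError` if the path is not under the prefix, which is why the launcher only calls it on paths that already passed the allow-pattern filter.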

def sync_single_file(source_file: Path, destination_file: Path, force_copy: bool) -> tuple[bool, int]:
    destination_file.parent.mkdir(parents=True, exist_ok=True)
    size = source_file.stat().st_size

    if (
        destination_file.exists()
        and not force_copy
        and destination_file.stat().st_size == size
    ):
        return False, size

    shutil.copy2(source_file, destination_file)
    return True, size
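`sync_single_file` skips the copy when the destination already exists with the same byte size. That is a cheap change check: it avoids re-copying multi-GB weights, but it would miss a same-size content change unless `--force-download` is passed. A self-contained demonstration of the skip rule:

```python
import shutil
import tempfile
from pathlib import Path

def sync(src: Path, dst: Path, force: bool = False) -> bool:
    # Same skip rule as the launcher: equal size and no force -> keep as-is.
    dst.parent.mkdir(parents=True, exist_ok=True)
    if dst.exists() and not force and dst.stat().st_size == src.stat().st_size:
        return False
    shutil.copy2(src, dst)
    return True

with tempfile.TemporaryDirectory() as tmp:
    src = Path(tmp) / "model.bin"
    dst = Path(tmp) / "out" / "model.bin"
    src.write_bytes(b"weights")
    first = sync(src, dst)    # copies the file
    second = sync(src, dst)   # same size -> skipped
```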

def download_and_sync_model(
    context: RuntimeContext,
    repo_id: str,
    revision: str,
    force_download: bool,
    best_model_only: bool,
) -> tuple[int, int, int, int]:
    from huggingface_hub import hf_hub_download

    repo_files = resolve_repo_files(repo_id=repo_id, revision=revision, best_model_only=best_model_only)
    total_files = len(repo_files)
    total_bytes = sum(int(item["size"] or 0) for item in repo_files)

    copied_files = 0
    skipped_files = 0
    copied_bytes = 0
    skipped_bytes = 0
    processed_bytes = 0
    download_scope = "best-model" if best_model_only else "toàn bộ model"

    print_step(f"Tìm thấy {total_files} file cần đồng bộ cho {download_scope}.")

    for index, repo_item in enumerate(repo_files, start=1):
        repo_file = str(repo_item["path"])
        destination_path = get_target_destination(context, repo_file)
        relative_label = destination_path.relative_to(context.root).as_posix()
        print_step(f"[{index}/{total_files}] Đang tải {relative_label}")

        cached_file = hf_hub_download(
            repo_id=repo_id,
            repo_type="dataset",
            revision=revision,
            filename=repo_file,
            force_download=force_download,
            local_files_only=False,
        )

        copied, size = sync_single_file(Path(cached_file), destination_path, force_copy=force_download)
        if copied:
            copied_files += 1
            copied_bytes += size
            print_step(f" Đã đồng bộ {relative_label} ({format_bytes(size)})")
        else:
            skipped_files += 1
            skipped_bytes += size
            print_step(f" Giữ nguyên {relative_label} ({format_bytes(size)})")

        processed_bytes += size
        if processed_bytes > total_bytes:
            total_bytes = processed_bytes

        print_step(
            " Tổng tiến độ "
            f"{render_progress_bar(processed_bytes, total_bytes)} "
            f"({format_bytes(processed_bytes)}/{format_bytes(total_bytes)})"
        )

    return copied_files, skipped_files, copied_bytes, skipped_bytes
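The loop above reports cumulative progress through `render_progress_bar` and `format_bytes`, which are referenced here but defined outside this hunk. A hypothetical bar renderer with the same inputs (this is a sketch of one plausible implementation, not the launcher's actual one) could be:

```python
def render_bar(done: int, total: int, width: int = 20) -> str:
    # Clamp the ratio so under-reported totals never overflow the bar,
    # mirroring the launcher's "grow total_bytes if exceeded" safeguard.
    ratio = 0.0 if total <= 0 else min(done / total, 1.0)
    filled = int(ratio * width)
    return "[" + "#" * filled + "-" * (width - filled) + f"] {ratio * 100:3.0f}%"
```

Clamping matters because `total_bytes` comes from repo metadata (`size` may be `None`), so the bytes actually processed can exceed the announced total.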

def required_model_files(context: RuntimeContext, best_model_only: bool) -> list[Path]:
    if best_model_only:
        model_dir = context.local_best_model_dir
    else:
        model_dir = context.local_model_dir

    return [
        model_dir / "config.json",
        model_dir / "generation_config.json",
        model_dir / "model.safetensors",
        model_dir / "tokenizer_config.json",
        model_dir / "special_tokens_map.json",
        model_dir / "spiece.model",
    ]


def validate_local_model_dir(context: RuntimeContext, best_model_only: bool) -> None:
    missing_files = [
        str(path.relative_to(context.root))
        for path in required_model_files(context, best_model_only)
        if not path.exists()
    ]
    if missing_files:
        raise FileNotFoundError(
            "Model chưa đầy đủ sau khi tải về. Thiếu các file: " + ", ".join(missing_files)
        )


def prepare_model(
    context: RuntimeContext,
    repo_id: str,
    revision: str,
    force_download: bool,
    skip_download: bool,
    best_model_only: bool,
) -> None:
    if skip_download:
        print_step("Bỏ qua bước tải model theo yêu cầu `--skip-download`.")
        validate_local_model_dir(context, best_model_only=best_model_only)
        return

    copied_files, skipped_files, copied_bytes, skipped_bytes = download_and_sync_model(
        context=context,
        repo_id=repo_id,
        revision=revision,
        force_download=force_download,
        best_model_only=best_model_only,
    )
    validate_local_model_dir(context, best_model_only=best_model_only)

    scope = "best-model" if best_model_only else "toàn bộ model"
    print_step(
        f"Đồng bộ {scope} xong. "
        f"File mới/cập nhật: {copied_files} ({format_bytes(copied_bytes)}), "
        f"file giữ nguyên: {skipped_files} ({format_bytes(skipped_bytes)})."
    )
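`validate_local_model_dir` fails fast if any of the six required model/tokenizer files is absent after the sync. The core check, demonstrated on its own with a reduced file list (a subset chosen for brevity):

```python
import tempfile
from pathlib import Path

REQUIRED = ["config.json", "model.safetensors", "spiece.model"]  # subset of the real list

def missing_in(model_dir: Path) -> list[str]:
    # Names of required files that are absent from the directory.
    return [name for name in REQUIRED if not (model_dir / name).exists()]

with tempfile.TemporaryDirectory() as tmp:
    model_dir = Path(tmp)
    (model_dir / "config.json").write_text("{}")
    gaps = missing_in(model_dir)  # everything except config.json
```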

def build_runtime_env(context: RuntimeContext, args: argparse.Namespace) -> dict[str, str]:
    env = os.environ.copy()

    if args.host:
        env["HVU_HOST"] = args.host
    if args.port is not None:
        env["HVU_PORT"] = str(args.port)
    if args.device:
        env["HVU_DEVICE"] = args.device
    if args.debug:
        env["HVU_DEBUG"] = "1"
    if args.no_browser:
        env["HVU_OPEN_BROWSER"] = "0"

    env["HVU_MODEL_DIR"] = str(context.local_model_dir)
    return env


def launch_app(context: RuntimeContext, args: argparse.Namespace) -> int:
    if not context.main_file.exists():
        raise FileNotFoundError(f"Không tìm thấy file chạy ứng dụng: {context.main_file}")

    env = build_runtime_env(context, args)
    command = [sys.executable, str(context.main_file)]

    print_step("Đang chạy ứng dụng web...")
    print_step(
        "Mở trình duyệt tại "
        f"http://{env.get('HVU_HOST', '127.0.0.1')}:{env.get('HVU_PORT', '5000')}"
    )
    return subprocess.call(command, cwd=str(context.root), env=env)

def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(
        description=(
            "Launcher cho HVU_QA: có thể chạy full project nếu đang đứng trong repo, "
            "hoặc tự dựng runtime standalone khi chỉ có file HVU_QA_tool.py."
        ),
    )
    parser.add_argument("--repo-id", default=HF_DATASET_REPO_ID, help="Repo dataset trên Hugging Face.")
    parser.add_argument("--revision", default=HF_DATASET_REVISION, help="Revision trên Hugging Face.")
    parser.add_argument("--host", default=None, help="Host chạy Flask. Mặc định dùng HVU_HOST hoặc 127.0.0.1.")
    parser.add_argument("--port", type=int, default=None, help="Port chạy Flask. Mặc định dùng HVU_PORT hoặc 5000.")
    parser.add_argument(
        "--device",
        choices=["auto", "cpu", "cuda"],
        default=None,
        help="Thiết bị chạy model. Mặc định dùng HVU_DEVICE hoặc auto.",
    )
    parser.add_argument("--debug", action="store_true", help="Bật Flask debug.")
    parser.add_argument("--no-browser", action="store_true", help="Không tự mở trình duyệt.")
    parser.add_argument("--no-venv", action="store_true", help="Không tự tạo virtualenv riêng cho launcher.")
    parser.add_argument("--force-download", action="store_true", help="Tải lại model và ghi đè file local.")
    parser.add_argument(
        "--best-model-only",
        action="store_true",
        help="Chỉ tải thư mục best-model. Lệnh này chỉ dùng được khi repo thật sự có best-model.",
    )
    parser.add_argument("--skip-download", action="store_true", help="Bỏ qua bước tải model từ Hugging Face.")
    parser.add_argument("--skip-install", action="store_true", help="Không tự cài dependency còn thiếu.")
    parser.add_argument("--skip-run", action="store_true", help="Chỉ chuẩn bị môi trường và model, không chạy app.")
    parser.add_argument(
        "--runtime-dir",
        default="HVU_QA_runtime",
        help="Thư mục runtime standalone sẽ được tạo nếu không có full project hoặc khi ép standalone.",
    )
    parser.add_argument(
        "--force-standalone-runtime",
        action="store_true",
        help="Luôn dựng runtime standalone, kể cả khi đang đứng trong full project.",
    )
    parser.add_argument(
        "--force-runtime-refresh",
        action="store_true",
        help="Ghi đè lại các file runtime standalone được nhúng sẵn trong launcher.",
    )
    parser.add_argument(
        "--prepare-runtime-only",
        action="store_true",
        help="Chỉ dựng runtime standalone hoặc kiểm tra full project hiện tại, không cài dependency, không tải model.",
    )
    return parser


def main() -> int:
    if hasattr(sys.stdout, "reconfigure"):
        sys.stdout.reconfigure(encoding="utf-8")
    if hasattr(sys.stderr, "reconfigure"):
        sys.stderr.reconfigure(encoding="utf-8")

    parser = build_parser()
    args = parser.parse_args()

    bootstrap_exit_code = maybe_bootstrap_tool_venv(args)
    if bootstrap_exit_code is not None:
        return bootstrap_exit_code

    print_step("Bắt đầu chuẩn bị dự án HVU_QA...")
    context = resolve_runtime_context(args)

    if args.prepare_runtime_only:
        print_step("Đã chuẩn bị xong runtime. Bỏ qua các bước tiếp theo theo `--prepare-runtime-only`.")
        return 0

    ensure_huggingface_hub(skip_install=args.skip_install, context=context)
    prepare_model(
        context=context,
        repo_id=args.repo_id,
        revision=args.revision,
        force_download=args.force_download,
        skip_download=args.skip_download,
        best_model_only=args.best_model_only,
    )
    ensure_runtime_dependencies(skip_install=args.skip_install, context=context)

    if args.skip_run:
        print_step("Đã chuẩn bị xong model và dependency. Bỏ qua chạy app theo `--skip-run`.")
        return 0

    return launch_app(context, args)


if __name__ == "__main__":
    raise SystemExit(main())
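The launcher's flag surface can be exercised without running anything: `argparse` derives attribute names from the long option (dashes become underscores), and `action="store_true"` flags default to `False`. A reduced parser mirroring two of the flags above behaves like:

```python
import argparse

# Reduced mirror of build_parser() -- only two of the real flags, same shapes.
parser = argparse.ArgumentParser()
parser.add_argument("--port", type=int, default=None)
parser.add_argument("--skip-download", action="store_true")

args = parser.parse_args(["--port", "8000", "--skip-download"])
# "--skip-download" is exposed as args.skip_download
```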
HVU_QA/backend/__init__.py
ADDED
@@ -0,0 +1,3 @@

from .app import create_app

__all__ = ["create_app"]
HVU_QA/backend/__pycache__/__init__.cpython-311.pyc
ADDED
Binary file (205 Bytes).

HVU_QA/backend/__pycache__/app.cpython-311.pyc
ADDED
Binary file (18.8 kB).
HVU_QA/backend/app.py
ADDED
@@ -0,0 +1,319 @@
from __future__ import annotations

import os
import time
from pathlib import Path

from flask import Flask, jsonify, request, send_from_directory

from generate_question import (
    APP_TITLE,
    QUESTION_LIMIT,
    QuestionGenerator,
    format_questions,
    normalize_text,
    parse_question_count,
    resolve_model_dir,
)

IGNORED_MODEL_DIR_NAMES = {
    ".git",
    ".vscode",
    "__pycache__",
    "backend",
    "frontend",
    "venv",
}


def project_root() -> Path:
    return Path(__file__).resolve().parents[1]


def build_generator(
    model_dir: str | Path | None = None,
    prefer_nested_model: bool = True,
) -> QuestionGenerator:
    root = project_root()
    selected_model_dir = (
        Path(model_dir).expanduser()
        if model_dir is not None
        else Path(os.getenv("HVU_MODEL_DIR", str(root / "t5-viet-qg-finetuned"))).expanduser()
    )
    if not selected_model_dir.is_absolute():
        selected_model_dir = root / selected_model_dir

    return QuestionGenerator(
        model_dir=str(selected_model_dir),
        task_prefix=os.getenv("HVU_TASK_PREFIX", "sinh câu hỏi"),
        max_source_length=int(os.getenv("HVU_MAX_SOURCE_LENGTH", "512")),
        max_new_tokens=int(os.getenv("HVU_MAX_NEW_TOKENS", "64")),
        device=os.getenv("HVU_DEVICE", "auto"),
        cpu_threads=_read_optional_int(os.getenv("HVU_CPU_THREADS")),
        gpu_dtype=os.getenv("HVU_GPU_DTYPE", "auto"),
        prefer_nested_model=prefer_nested_model,
    )


def _read_optional_int(value: str | None) -> int | None:
    if value in (None, ""):
        return None
    return int(value)


def _humanize_model_segment(value: str) -> str:
    normalized = value.replace("_", "-")
    parts: list[str] = []
    for part in normalized.split("-"):
        lowered = part.lower()
        if not lowered:
            continue
        if lowered in {"t5", "qg", "qa", "hvu"}:
            parts.append(lowered.upper())
        elif lowered == "seq2seq":
            parts.append("Seq2Seq")
        elif lowered == "checkpoint":
            parts.append("Checkpoint")
        elif part.isdigit():
            parts.append(part)
        else:
            parts.append(part.capitalize())
    return "-".join(parts) or "Model"
def _display_model_name(meta: dict[str, object]) -> str:
|
| 85 |
+
raw_name = Path(str(meta.get("model_root") or meta.get("model_dir") or "model")).name
|
| 86 |
+
return _humanize_model_segment(raw_name)
|
| 87 |
+
|
| 88 |
+
|
| 89 |
+
def _model_label(relative_path: str | Path) -> str:
|
| 90 |
+
path = Path(relative_path)
|
| 91 |
+
return path.name or "model"
|
| 92 |
+
|
| 93 |
+
|
| 94 |
+
def _iter_model_candidates(root: Path):
|
| 95 |
+
for child in sorted(root.iterdir(), key=lambda path: path.name.lower()):
|
| 96 |
+
if not child.is_dir() or child.name.startswith(".") or child.name in IGNORED_MODEL_DIR_NAMES:
|
| 97 |
+
continue
|
| 98 |
+
|
| 99 |
+
if (child / "config.json").exists():
|
| 100 |
+
yield {"path": child, "prefer_nested_model": False}
|
| 101 |
+
|
| 102 |
+
for nested_name in ("best-model", "final-model"):
|
| 103 |
+
nested = child / nested_name
|
| 104 |
+
if nested.is_dir() and (nested / "config.json").exists():
|
| 105 |
+
yield {"path": nested, "prefer_nested_model": False}
|
| 106 |
+
|
| 107 |
+
|
| 108 |
+
def _discover_available_models(
|
| 109 |
+
root: Path,
|
| 110 |
+
active_generator: QuestionGenerator | None = None,
|
| 111 |
+
) -> list[dict[str, str]]:
|
| 112 |
+
models: list[dict[str, str]] = []
|
| 113 |
+
seen_model_roots: set[str] = set()
|
| 114 |
+
root = root.resolve()
|
| 115 |
+
|
| 116 |
+
for candidate_info in _iter_model_candidates(root):
|
| 117 |
+
candidate = candidate_info["path"]
|
| 118 |
+
prefer_nested_model = bool(candidate_info["prefer_nested_model"])
|
| 119 |
+
model_key = str(candidate.resolve())
|
| 120 |
+
if model_key in seen_model_roots:
|
| 121 |
+
continue
|
| 122 |
+
|
| 123 |
+
try:
|
| 124 |
+
relative_candidate = candidate.resolve().relative_to(root)
|
| 125 |
+
except ValueError:
|
| 126 |
+
continue
|
| 127 |
+
|
| 128 |
+
seen_model_roots.add(model_key)
|
| 129 |
+
models.append(
|
| 130 |
+
{
|
| 131 |
+
"id": relative_candidate.as_posix(),
|
| 132 |
+
"label": _model_label(relative_candidate),
|
| 133 |
+
"model_root": str(candidate.resolve()),
|
| 134 |
+
"model_dir": str(resolve_model_dir(candidate, prefer_nested_model=False).resolve()),
|
| 135 |
+
"prefer_nested_model": prefer_nested_model,
|
| 136 |
+
}
|
| 137 |
+
)
|
| 138 |
+
|
| 139 |
+
if active_generator is not None:
|
| 140 |
+
current_root = active_generator.model_root.resolve()
|
| 141 |
+
current_dir = active_generator.model_dir.resolve()
|
| 142 |
+
exists = any(
|
| 143 |
+
Path(item["model_root"]).resolve() == current_root
|
| 144 |
+
or Path(item["model_dir"]).resolve() == current_dir
|
| 145 |
+
for item in models
|
| 146 |
+
)
|
| 147 |
+
if not exists:
|
| 148 |
+
models.append(
|
| 149 |
+
{
|
| 150 |
+
"id": current_root.as_posix(),
|
| 151 |
+
"label": _display_model_name(active_generator.metadata()),
|
| 152 |
+
"model_root": str(current_root),
|
| 153 |
+
"model_dir": str(current_dir),
|
| 154 |
+
"prefer_nested_model": False,
|
| 155 |
+
}
|
| 156 |
+
)
|
| 157 |
+
|
| 158 |
+
return models
|
| 159 |
+
|
| 160 |
+
|
| 161 |
+
def _selected_model_id(
|
| 162 |
+
app: Flask,
|
| 163 |
+
models: list[dict[str, str]],
|
| 164 |
+
active_generator: QuestionGenerator | None = None,
|
| 165 |
+
) -> str:
|
| 166 |
+
explicit_selection = str(app.config.get("SELECTED_MODEL_ID") or "").strip()
|
| 167 |
+
if explicit_selection and any(item["id"] == explicit_selection for item in models):
|
| 168 |
+
return explicit_selection
|
| 169 |
+
|
| 170 |
+
active_generator = active_generator or _generator(app)
|
| 171 |
+
current_root = active_generator.model_root.resolve()
|
| 172 |
+
current_dir = active_generator.model_dir.resolve()
|
| 173 |
+
|
| 174 |
+
for item in models:
|
| 175 |
+
if Path(item["model_dir"]).resolve() == current_dir:
|
| 176 |
+
return item["id"]
|
| 177 |
+
|
| 178 |
+
for item in models:
|
| 179 |
+
if Path(item["model_root"]).resolve() == current_root:
|
| 180 |
+
return item["id"]
|
| 181 |
+
|
| 182 |
+
return models[0]["id"] if models else ""
|
| 183 |
+
|
| 184 |
+
|
| 185 |
+
def _switch_generator(app: Flask, model_id: str) -> QuestionGenerator:
|
| 186 |
+
available_models = _discover_available_models(app.config["PROJECT_ROOT"], _generator(app))
|
| 187 |
+
selected_model = next((item for item in available_models if item["id"] == model_id), None)
|
| 188 |
+
if selected_model is None:
|
| 189 |
+
raise ValueError("Model được chọn không hợp lệ hoặc chưa tồn tại trong thư mục dự án.")
|
| 190 |
+
|
| 191 |
+
current_model_id = _selected_model_id(app, available_models)
|
| 192 |
+
if current_model_id != model_id:
|
| 193 |
+
app.config["GENERATOR"] = build_generator(
|
| 194 |
+
selected_model["model_root"],
|
| 195 |
+
prefer_nested_model=bool(selected_model.get("prefer_nested_model")),
|
| 196 |
+
)
|
| 197 |
+
|
| 198 |
+
app.config["SELECTED_MODEL_ID"] = model_id
|
| 199 |
+
return _generator(app)
|
| 200 |
+
|
| 201 |
+
|
| 202 |
+
def _info_payload(app: Flask, active_generator: QuestionGenerator | None = None) -> dict[str, object]:
|
| 203 |
+
active_generator = active_generator or _generator(app)
|
| 204 |
+
meta = active_generator.metadata()
|
| 205 |
+
available_models = _discover_available_models(app.config["PROJECT_ROOT"], active_generator)
|
| 206 |
+
selected_model_id = _selected_model_id(app, available_models, active_generator)
|
| 207 |
+
model_name = next(
|
| 208 |
+
(item["label"] for item in available_models if item["id"] == selected_model_id),
|
| 209 |
+
_display_model_name(meta),
|
| 210 |
+
)
|
| 211 |
+
|
| 212 |
+
return {
|
| 213 |
+
"ok": True,
|
| 214 |
+
"title": APP_TITLE,
|
| 215 |
+
"model_name": model_name,
|
| 216 |
+
"selected_model_id": selected_model_id,
|
| 217 |
+
"available_models": [{"id": item["id"], "label": item["label"]} for item in available_models],
|
| 218 |
+
"meta": meta,
|
| 219 |
+
}
|
| 220 |
+
|
| 221 |
+
|
| 222 |
+
def create_app(generator: QuestionGenerator | None = None) -> Flask:
|
| 223 |
+
root = project_root()
|
| 224 |
+
frontend_root = root / "frontend"
|
| 225 |
+
|
| 226 |
+
app = Flask(__name__, static_folder=None)
|
| 227 |
+
app.json.ensure_ascii = False
|
| 228 |
+
app.config["GENERATOR"] = generator or build_generator()
|
| 229 |
+
app.config["PROJECT_ROOT"] = root
|
| 230 |
+
app.config["FRONTEND_ROOT"] = frontend_root
|
| 231 |
+
app.config["SELECTED_MODEL_ID"] = ""
|
| 232 |
+
|
| 233 |
+
@app.get("/")
|
| 234 |
+
def index():
|
| 235 |
+
return send_from_directory(app.config["FRONTEND_ROOT"], "index.html")
|
| 236 |
+
|
| 237 |
+
@app.get("/frontend/<path:filename>")
|
| 238 |
+
def frontend_file(filename: str):
|
| 239 |
+
return send_from_directory(app.config["FRONTEND_ROOT"], filename)
|
| 240 |
+
|
| 241 |
+
@app.get("/assets/<path:filename>")
|
| 242 |
+
def asset_file(filename: str):
|
| 243 |
+
return send_from_directory(app.config["PROJECT_ROOT"], filename)
|
| 244 |
+
|
| 245 |
+
@app.get("/api/info")
|
| 246 |
+
def info():
|
| 247 |
+
return jsonify(_info_payload(app))
|
| 248 |
+
|
| 249 |
+
@app.post("/api/model")
|
| 250 |
+
def set_model():
|
| 251 |
+
payload = request.get_json(silent=True) or {}
|
| 252 |
+
model_id = str(payload.get("model_id") or "").strip()
|
| 253 |
+
if not model_id:
|
| 254 |
+
return jsonify({"ok": False, "error": "Vui lòng chọn model trước khi chuyển."}), 400
|
| 255 |
+
|
| 256 |
+
try:
|
| 257 |
+
active_generator = _switch_generator(app, model_id)
|
| 258 |
+
except ValueError as exc:
|
| 259 |
+
return jsonify({"ok": False, "error": str(exc)}), 404
|
| 260 |
+
|
| 261 |
+
return jsonify(_info_payload(app, active_generator))
|
| 262 |
+
|
| 263 |
+
@app.post("/api/generate")
|
| 264 |
+
def generate():
|
| 265 |
+
payload = request.get_json(silent=True) or {}
|
| 266 |
+
requested_model_id = str(payload.get("model_id") or "").strip()
|
| 267 |
+
|
| 268 |
+
if requested_model_id:
|
| 269 |
+
try:
|
| 270 |
+
active_generator = _switch_generator(app, requested_model_id)
|
| 271 |
+
except ValueError as exc:
|
| 272 |
+
return jsonify({"ok": False, "error": str(exc)}), 400
|
| 273 |
+
else:
|
| 274 |
+
active_generator = _generator(app)
|
| 275 |
+
|
| 276 |
+
text = normalize_text(payload.get("text"))
|
| 277 |
+
if not text:
|
| 278 |
+
return jsonify({"ok": False, "error": "Vui lòng nhập đoạn văn bản trước khi sinh câu hỏi."}), 400
|
| 279 |
+
|
| 280 |
+
raw_count = payload.get("num_questions")
|
| 281 |
+
if raw_count in (None, ""):
|
| 282 |
+
count = 100
|
| 283 |
+
else:
|
| 284 |
+
try:
|
| 285 |
+
count = int(raw_count)
|
| 286 |
+
except (TypeError, ValueError):
|
| 287 |
+
return jsonify({"ok": False, "error": "Số câu hỏi phải là số nguyên trong khoảng 1 đến 100."}), 400
|
| 288 |
+
|
| 289 |
+
if count < 1 or count > QUESTION_LIMIT:
|
| 290 |
+
return jsonify({"ok": False, "error": f"Số câu hỏi phải nằm trong khoảng 1 đến {QUESTION_LIMIT}."}), 400
|
| 291 |
+
|
| 292 |
+
started = time.perf_counter()
|
| 293 |
+
try:
|
| 294 |
+
questions = active_generator.generate(text, parse_question_count(count))
|
| 295 |
+
except Exception as exc: # noqa: BLE001
|
| 296 |
+
return jsonify({"ok": False, "error": str(exc)}), 500
|
| 297 |
+
|
| 298 |
+
elapsed_ms = round((time.perf_counter() - started) * 1000, 2)
|
| 299 |
+
info_payload = _info_payload(app, active_generator)
|
| 300 |
+
return jsonify(
|
| 301 |
+
{
|
| 302 |
+
"ok": True,
|
| 303 |
+
"text": text,
|
| 304 |
+
"num_questions": count,
|
| 305 |
+
"questions": questions,
|
| 306 |
+
"formatted": format_questions(questions),
|
| 307 |
+
"elapsed_ms": elapsed_ms,
|
| 308 |
+
"model_name": info_payload["model_name"],
|
| 309 |
+
"selected_model_id": info_payload["selected_model_id"],
|
| 310 |
+
"meta": active_generator.metadata(),
|
| 311 |
+
}
|
| 312 |
+
)
|
| 313 |
+
|
| 314 |
+
return app
|
| 315 |
+
|
| 316 |
+
|
| 317 |
+
def _generator(app: Flask) -> QuestionGenerator:
|
| 318 |
+
generator: QuestionGenerator = app.config["GENERATOR"]
|
| 319 |
+
return generator
|
HVU_QA/fine_tune_qg.py
ADDED
@@ -0,0 +1,556 @@
| 1 |
+
from __future__ import annotations
|
| 2 |
+
|
| 3 |
+
import argparse
|
| 4 |
+
import json
|
| 5 |
+
import os
|
| 6 |
+
import subprocess
|
| 7 |
+
import sys
|
| 8 |
+
from importlib import metadata
|
| 9 |
+
from inspect import signature
|
| 10 |
+
from pathlib import Path
|
| 11 |
+
from typing import Any
|
| 12 |
+
|
| 13 |
+
os.environ.setdefault("TOKENIZERS_PARALLELISM", "false")
|
| 14 |
+
os.environ.setdefault("PYTORCH_CUDA_ALLOC_CONF", "expandable_segments:True")
|
| 15 |
+
|
| 16 |
+
|
| 17 |
+
def raise_missing_dependency_error(exc: ModuleNotFoundError) -> None:
|
| 18 |
+
root = Path(__file__).resolve().parent
|
| 19 |
+
script = Path(__file__).resolve()
|
| 20 |
+
requirements = root / "requirements.txt"
|
| 21 |
+
venv_python = root / "venv" / ("Scripts/python.exe" if os.name == "nt" else "bin/python")
|
| 22 |
+
lines = [f"Thiếu thư viện Python: {exc.name}", f"Interpreter hiện tại: {sys.executable}"]
|
| 23 |
+
if venv_python.exists():
|
| 24 |
+
lines.extend([f"Venv của project: {venv_python}", f"Chạy lại bằng: {venv_python} {script}"])
|
| 25 |
+
if requirements.exists():
|
| 26 |
+
lines.extend(
|
| 27 |
+
[
|
| 28 |
+
"Hoặc cài dependencies cho interpreter hiện tại bằng:",
|
| 29 |
+
f"{sys.executable} -m pip install -r {requirements}",
|
| 30 |
+
]
|
| 31 |
+
)
|
| 32 |
+
raise SystemExit("\n".join(lines)) from exc
|
| 33 |
+
|
| 34 |
+
|
| 35 |
+
try:
|
| 36 |
+
import torch
|
| 37 |
+
from datasets import Dataset
|
| 38 |
+
from transformers import (
|
| 39 |
+
AutoModelForSeq2SeqLM,
|
| 40 |
+
AutoTokenizer,
|
| 41 |
+
DataCollatorForSeq2Seq,
|
| 42 |
+
EarlyStoppingCallback,
|
| 43 |
+
Seq2SeqTrainer,
|
| 44 |
+
Seq2SeqTrainingArguments,
|
| 45 |
+
set_seed,
|
| 46 |
+
)
|
| 47 |
+
from transformers.trainer_utils import get_last_checkpoint
|
| 48 |
+
except ModuleNotFoundError as exc:
|
| 49 |
+
raise_missing_dependency_error(exc)
|
| 50 |
+
|
| 51 |
+
|
| 52 |
+
def normalize_text(text: Any) -> str:
|
| 53 |
+
return " ".join(str(text or "").split())
|
| 54 |
+
|
| 55 |
+
|
| 56 |
+
def dedupe(items) -> list[str]:
|
| 57 |
+
seen, output = set(), []
|
| 58 |
+
for item in items:
|
| 59 |
+
if item and item not in seen:
|
| 60 |
+
seen.add(item)
|
| 61 |
+
output.append(item)
|
| 62 |
+
return output
|
| 63 |
+
|
| 64 |
+
|
| 65 |
+
def save_json(data: dict[str, Any], path: Path) -> None:
|
| 66 |
+
path.write_text(json.dumps(data, ensure_ascii=False, indent=2), encoding="utf-8")
|
| 67 |
+
|
| 68 |
+
|
| 69 |
+
def get_installed_version(package_name: str) -> tuple[int, ...]:
|
| 70 |
+
try:
|
| 71 |
+
version = metadata.version(package_name)
|
| 72 |
+
except metadata.PackageNotFoundError:
|
| 73 |
+
return ()
|
| 74 |
+
|
| 75 |
+
parts = []
|
| 76 |
+
for chunk in version.replace("-", ".").split("."):
|
| 77 |
+
digits = "".join(ch for ch in chunk if ch.isdigit())
|
| 78 |
+
if not digits:
|
| 79 |
+
break
|
| 80 |
+
parts.append(int(digits))
|
| 81 |
+
return tuple(parts)
|
| 82 |
+
|
| 83 |
+
|
| 84 |
+
def supports_data_seed() -> bool:
|
| 85 |
+
return get_installed_version("accelerate") >= (1, 1, 0)
|
| 86 |
+
|
| 87 |
+
|
| 88 |
+
def run_nvidia_smi(query: str) -> list[list[str]]:
|
| 89 |
+
try:
|
| 90 |
+
result = subprocess.run(
|
| 91 |
+
["nvidia-smi", f"--query-{query}", "--format=csv,noheader,nounits"],
|
| 92 |
+
check=True,
|
| 93 |
+
capture_output=True,
|
| 94 |
+
text=True,
|
| 95 |
+
)
|
| 96 |
+
except (FileNotFoundError, subprocess.CalledProcessError):
|
| 97 |
+
return []
|
| 98 |
+
|
| 99 |
+
return [
|
| 100 |
+
[part.strip() for part in line.split(",")]
|
| 101 |
+
for line in result.stdout.strip().splitlines()
|
| 102 |
+
if line.strip()
|
| 103 |
+
]
|
| 104 |
+
|
| 105 |
+
|
| 106 |
+
def query_gpu_memory():
|
| 107 |
+
rows = run_nvidia_smi("gpu=memory.total,memory.used,memory.free")
|
| 108 |
+
if not rows or len(rows[0]) < 3:
|
| 109 |
+
return None
|
| 110 |
+
try:
|
| 111 |
+
total_mb, used_mb, free_mb = (int(value) for value in rows[0][:3])
|
| 112 |
+
except ValueError:
|
| 113 |
+
return None
|
| 114 |
+
return {"total_mb": total_mb, "used_mb": used_mb, "free_mb": free_mb}
|
| 115 |
+
|
| 116 |
+
|
| 117 |
+
def query_gpu_processes() -> list[dict[str, Any]]:
|
| 118 |
+
processes = []
|
| 119 |
+
for row in run_nvidia_smi("compute-apps=pid,process_name,used_memory"):
|
| 120 |
+
if len(row) != 3:
|
| 121 |
+
continue
|
| 122 |
+
try:
|
| 123 |
+
pid = int(row[0])
|
| 124 |
+
used_memory_mb = int(row[2])
|
| 125 |
+
except ValueError:
|
| 126 |
+
continue
|
| 127 |
+
processes.append({"pid": pid, "process_name": row[1], "used_memory_mb": used_memory_mb})
|
| 128 |
+
return processes
|
| 129 |
+
|
| 130 |
+
|
| 131 |
+
def format_memory_mb(memory_mb: int) -> str:
|
| 132 |
+
return f"{memory_mb} MiB ({memory_mb / 1024:.2f} GiB)"
|
| 133 |
+
|
| 134 |
+
|
| 135 |
+
def active_gpu_processes() -> list[dict[str, Any]]:
|
| 136 |
+
current_pid = os.getpid()
|
| 137 |
+
return sorted(
|
| 138 |
+
[proc for proc in query_gpu_processes() if proc["pid"] != current_pid and proc["used_memory_mb"] > 0],
|
| 139 |
+
key=lambda item: item["used_memory_mb"],
|
| 140 |
+
reverse=True,
|
| 141 |
+
)
|
| 142 |
+
|
| 143 |
+
|
| 144 |
+
def append_process_lines(lines: list[str], header: str, processes: list[dict[str, Any]]) -> None:
|
| 145 |
+
if not processes:
|
| 146 |
+
return
|
| 147 |
+
lines.append(header)
|
| 148 |
+
lines.extend(
|
| 149 |
+
f"- PID {proc['pid']} | {proc['process_name']} | {format_memory_mb(proc['used_memory_mb'])}"
|
| 150 |
+
for proc in processes
|
| 151 |
+
)
|
| 152 |
+
|
| 153 |
+
|
| 154 |
+
def ensure_device_ready(args) -> None:
|
| 155 |
+
if args.device == "cpu":
|
| 156 |
+
return
|
| 157 |
+
if not torch.cuda.is_available():
|
| 158 |
+
if args.device == "cuda":
|
| 159 |
+
raise SystemExit("Bạn đã chọn --device cuda nhưng môi trường hiện tại không có CUDA.")
|
| 160 |
+
return
|
| 161 |
+
if args.skip_gpu_preflight:
|
| 162 |
+
return
|
| 163 |
+
|
| 164 |
+
gpu_memory = query_gpu_memory()
|
| 165 |
+
if gpu_memory is None or gpu_memory["free_mb"] >= args.min_free_gpu_mb:
|
| 166 |
+
return
|
| 167 |
+
|
| 168 |
+
lines = [
|
| 169 |
+
"GPU không đủ bộ nhớ để bắt đầu train ổn định.",
|
| 170 |
+
f"GPU free: {format_memory_mb(gpu_memory['free_mb'])} / total: {format_memory_mb(gpu_memory['total_mb'])}.",
|
| 171 |
+
f"Ngưỡng tối thiểu hiện tại: {format_memory_mb(args.min_free_gpu_mb)}.",
|
| 172 |
+
]
|
| 173 |
+
append_process_lines(lines, "Các tiến trình CUDA đang chiếm GPU:", active_gpu_processes())
|
| 174 |
+
lines.extend(
|
| 175 |
+
[
|
| 176 |
+
"Cách xử lý:",
|
| 177 |
+
"- Giải phóng tiến trình đang chiếm GPU rồi chạy lại.",
|
| 178 |
+
"- Hoặc train trên CPU bằng `python fine_tune_qg.py --device cpu`.",
|
| 179 |
+
"- Nếu GPU đã rảnh mà vẫn thiếu VRAM, thử `--per_device_train_batch_size 1 --per_device_eval_batch_size 1 --gradient_accumulation_steps 16 --gradient_checkpointing`.",
|
| 180 |
+
"- Nếu bạn vẫn muốn thử trên GPU hiện tại, thêm `--skip_gpu_preflight` để bỏ qua kiểm tra này.",
|
| 181 |
+
]
|
| 182 |
+
)
|
| 183 |
+
raise SystemExit("\n".join(lines))
|
| 184 |
+
|
| 185 |
+
|
| 186 |
+
def raise_cuda_oom(args) -> None:
|
| 187 |
+
gpu_memory = query_gpu_memory()
|
| 188 |
+
lines = ["Train thất bại do CUDA out of memory."]
|
| 189 |
+
if gpu_memory is not None:
|
| 190 |
+
lines.append(
|
| 191 |
+
f"VRAM hiện tại: free {format_memory_mb(gpu_memory['free_mb'])}, used {format_memory_mb(gpu_memory['used_mb'])}, total {format_memory_mb(gpu_memory['total_mb'])}."
|
| 192 |
+
)
|
| 193 |
+
append_process_lines(lines, "Các tiến trình khác đang dùng GPU:", active_gpu_processes())
|
| 194 |
+
lines.extend(
|
| 195 |
+
[
|
| 196 |
+
"Gợi ý:",
|
| 197 |
+
"- Dừng tiến trình CUDA khác rồi chạy lại.",
|
| 198 |
+
f"- Hoặc chạy trên CPU: python fine_tune_qg.py --device cpu --output_dir {args.output_dir}-cpu",
|
| 199 |
+
"- Khi GPU rảnh, nếu vẫn thiếu VRAM, giảm batch: `--per_device_train_batch_size 1 --per_device_eval_batch_size 1 --gradient_accumulation_steps 16 --gradient_checkpointing`.",
|
| 200 |
+
]
|
| 201 |
+
)
|
| 202 |
+
raise SystemExit("\n".join(lines))
|
| 203 |
+
|
| 204 |
+
|
| 205 |
+
def build_source(title: str, context: str, answer: str, task_prefix: str) -> str:
|
| 206 |
+
parts = [f"{task_prefix}:"]
|
| 207 |
+
if title:
|
| 208 |
+
parts.append(f"tiêu đề: {title}")
|
| 209 |
+
parts.extend((f"ngữ cảnh: {context}", f"đáp án: {answer}"))
|
| 210 |
+
return "\n".join(parts)
|
| 211 |
+
|
| 212 |
+
|
| 213 |
+
def load_squad_qg_examples(
|
| 214 |
+
file_path: str,
|
| 215 |
+
use_all_answers: bool = True,
|
| 216 |
+
task_prefix: str = "sinh câu hỏi",
|
| 217 |
+
require_answer_in_context: bool = False,
|
| 218 |
+
) -> tuple[list[dict[str, str]], dict[str, int]]:
|
| 219 |
+
data = json.loads(Path(file_path).read_text(encoding="utf-8"))
|
| 220 |
+
examples = []
|
| 221 |
+
stats = {
|
| 222 |
+
"articles": 0,
|
| 223 |
+
"paragraphs": 0,
|
| 224 |
+
"qas": 0,
|
| 225 |
+
"examples": 0,
|
| 226 |
+
"skipped_impossible": 0,
|
| 227 |
+
"skipped_no_context": 0,
|
| 228 |
+
"skipped_no_question": 0,
|
| 229 |
+
"skipped_no_answers": 0,
|
| 230 |
+
"skipped_answer_not_in_context": 0,
|
| 231 |
+
"answers_not_in_context_but_kept": 0,
|
| 232 |
+
}
|
| 233 |
+
|
| 234 |
+
for article in data.get("data", []):
|
| 235 |
+
stats["articles"] += 1
|
| 236 |
+
title = normalize_text(article.get("title"))
|
| 237 |
+
for paragraph in article.get("paragraphs", []):
|
| 238 |
+
stats["paragraphs"] += 1
|
| 239 |
+
context = normalize_text(paragraph.get("context"))
|
| 240 |
+
if not context:
|
| 241 |
+
stats["skipped_no_context"] += 1
|
| 242 |
+
continue
|
| 243 |
+
|
| 244 |
+
for qa in paragraph.get("qas", []):
|
| 245 |
+
stats["qas"] += 1
|
| 246 |
+
question = normalize_text(qa.get("question"))
|
| 247 |
+
if qa.get("is_impossible", False):
|
| 248 |
+
stats["skipped_impossible"] += 1
|
| 249 |
+
continue
|
| 250 |
+
if not question:
|
| 251 |
+
stats["skipped_no_question"] += 1
|
| 252 |
+
continue
|
| 253 |
+
|
| 254 |
+
answers = dedupe(normalize_text(answer.get("text")) for answer in qa.get("answers", []))
|
| 255 |
+
if not answers:
|
| 256 |
+
stats["skipped_no_answers"] += 1
|
| 257 |
+
continue
|
| 258 |
+
if not use_all_answers:
|
| 259 |
+
answers = answers[:1]
|
| 260 |
+
|
| 261 |
+
for answer in answers:
|
| 262 |
+
in_context = answer in context
|
| 263 |
+
if require_answer_in_context and not in_context:
|
| 264 |
+
stats["skipped_answer_not_in_context"] += 1
|
| 265 |
+
continue
|
| 266 |
+
if not in_context:
|
| 267 |
+
stats["answers_not_in_context_but_kept"] += 1
|
| 268 |
+
|
| 269 |
+
examples.append(
|
| 270 |
+
{
|
| 271 |
+
"source": build_source(title, context, answer, task_prefix),
|
| 272 |
+
"target": question,
|
| 273 |
+
}
|
| 274 |
+
)
|
| 275 |
+
stats["examples"] += 1
|
| 276 |
+
|
| 277 |
+
return examples, stats
|
| 278 |
+
|
| 279 |
+
|
| 280 |
+
def preprocess_function(batch, tokenizer, max_source_length: int, max_target_length: int) -> dict[str, Any]:
|
| 281 |
+
model_inputs = tokenizer(batch["source"], max_length=max_source_length, truncation=True)
|
| 282 |
+
model_inputs["labels"] = tokenizer(
|
| 283 |
+
text_target=batch["target"],
|
| 284 |
+
max_length=max_target_length,
|
| 285 |
+
truncation=True,
|
| 286 |
+
)["input_ids"]
|
| 287 |
+
return model_inputs
|
| 288 |
+
|
| 289 |
+
|
| 290 |
+
def build_supported_kwargs(cls, kwargs: dict[str, Any], aliases=None) -> dict[str, Any]:
|
| 291 |
+
params = set(signature(cls.__init__).parameters)
|
| 292 |
+
aliases = aliases or {}
|
| 293 |
+
resolved = {}
|
| 294 |
+
for key, value in kwargs.items():
|
| 295 |
+
if value is None:
|
| 296 |
+
continue
|
| 297 |
+
target = key if key in params else aliases.get(key)
|
| 298 |
+
if target in params:
|
| 299 |
+
resolved[target] = value
|
| 300 |
+
return resolved
|
| 301 |
+
|
| 302 |
+
|
| 303 |
+
def build_training_args(args, has_eval: bool):
|
| 304 |
+
kwargs = {
|
| 305 |
+
"output_dir": args.output_dir,
|
| 306 |
+
"overwrite_output_dir": False,
|
| 307 |
+
"learning_rate": args.learning_rate,
|
| 308 |
+
"per_device_train_batch_size": args.per_device_train_batch_size,
|
| 309 |
+
"per_device_eval_batch_size": args.per_device_eval_batch_size,
|
| 310 |
+
"gradient_accumulation_steps": args.gradient_accumulation_steps,
|
| 311 |
+
"weight_decay": args.weight_decay,
|
| 312 |
+
"num_train_epochs": args.num_train_epochs,
|
| 313 |
+
"warmup_ratio": args.warmup_ratio,
|
| 314 |
+
"logging_strategy": "steps",
|
| 315 |
+
"logging_steps": args.logging_steps,
|
| 316 |
+
"save_strategy": args.save_strategy_type,
|
| 317 |
+
"save_steps": args.save_steps if args.save_strategy_type == "steps" else None,
|
| 318 |
+
"save_total_limit": args.save_total_limit,
|
| 319 |
+
"report_to": "none",
|
| 320 |
+
"fp16": args.fp16,
|
| 321 |
+
"bf16": args.bf16,
|
| 322 |
+
"predict_with_generate": False,
|
| 323 |
+
"generation_max_length": args.max_target_length,
|
| 324 |
+
"dataloader_num_workers": args.dataloader_num_workers,
|
| 325 |
+
"dataloader_pin_memory": not args.no_pin_memory,
|
| 326 |
+
"save_only_model": args.save_only_model,
|
| 327 |
+
"restore_callback_states_from_checkpoint": args.restore_callback_states_from_checkpoint,
|
| 328 |
+
"torch_empty_cache_steps": args.torch_empty_cache_steps or None,
|
| 329 |
+
"seed": args.seed,
|
| 330 |
+
"data_seed": args.seed if supports_data_seed() else None,
|
| 331 |
+
"use_cpu": True if args.device == "cpu" else None,
|
| 332 |
+
"gradient_checkpointing": True if args.gradient_checkpointing else None,
|
| 333 |
+
"load_best_model_at_end": has_eval,
|
| 334 |
+
"metric_for_best_model": "eval_loss" if has_eval else None,
|
| 335 |
+
"greater_is_better": False if has_eval else None,
|
| 336 |
+
"eval_strategy": args.save_strategy_type if has_eval else None,
|
| 337 |
+
"eval_steps": args.eval_steps if has_eval and args.save_strategy_type == "steps" else None,
|
| 338 |
+
}
|
| 339 |
+
return Seq2SeqTrainingArguments(
|
| 340 |
+
**build_supported_kwargs(
|
| 341 |
+
Seq2SeqTrainingArguments,
|
| 342 |
+
kwargs,
|
| 343 |
+
aliases={"eval_strategy": "evaluation_strategy", "use_cpu": "no_cuda"},
|
| 344 |
+
)
|
| 345 |
+
)
|
| 346 |
+
|
| 347 |
+
|
| 348 |
+
def resolve_resume_checkpoint(args):
|
| 349 |
+
if args.resume_checkpoint:
|
| 350 |
+
if not Path(args.resume_checkpoint).is_dir():
|
| 351 |
+
raise FileNotFoundError(f"Không tìm thấy resume_checkpoint: {args.resume_checkpoint}")
|
| 352 |
+
return args.resume_checkpoint
|
| 353 |
+
if args.resume_from_latest and Path(args.output_dir).is_dir():
|
| 354 |
+
return get_last_checkpoint(args.output_dir)
|
| 355 |
+
return None
|
| 356 |
+
|
| 357 |
+
|
| 358 |
+
def validate_args(args, has_eval: bool) -> None:
|
| 359 |
+
if has_eval and args.save_strategy_type == "steps":
|
| 360 |
+
if args.eval_steps <= 0 or args.save_steps <= 0:
|
| 361 |
+
raise ValueError("save_steps và eval_steps phải > 0")
|
| 362 |
+
if args.save_steps % args.eval_steps != 0:
|
| 363 |
+
raise ValueError("save_steps phải là bội số của eval_steps")
|
| 364 |
+
if args.save_only_model and (args.resume_from_latest or args.resume_checkpoint):
|
| 365 |
+
print("Cảnh báo: save_only_model sẽ không resume train đầy đủ được.")
|
| 366 |
+
|
| 367 |
+
|
| 368 |
+
def build_parser() -> argparse.ArgumentParser:
|
| 369 |
+
parser = argparse.ArgumentParser()
|
| 370 |
+
add = parser.add_argument
|
| 371 |
+
|
| 372 |
+
add("--train_file", default="40k_train.json")
|
| 373 |
+
add("--validation_file", default=None)
|
| 374 |
+
add("--output_dir", default="t5-viet-qg-finetuned")
|
| 375 |
+
add("--model_name", default="VietAI/vit5-base")
|
| 376 |
+
add("--task_prefix", default="sinh câu hỏi")
|
| 377 |
+
|
| 378 |
+
    add("--max_source_length", type=int, default=512)
    add("--max_target_length", type=int, default=64)
    add("--val_ratio", type=float, default=0.1)

    add("--per_device_train_batch_size", type=int, default=4)
    add("--per_device_eval_batch_size", type=int, default=4)
    add("--gradient_accumulation_steps", type=int, default=4)
    add("--learning_rate", type=float, default=1e-4)
    add("--weight_decay", type=float, default=0.01)
    add("--warmup_ratio", type=float, default=0.05)
    add("--num_train_epochs", type=int, default=3)
    add("--logging_steps", type=int, default=50)
    add("--seed", type=int, default=42)
    add("--early_stopping_patience", type=int, default=2)

    add("--save_strategy_type", default="steps", choices=["steps", "epoch"])
    add("--save_steps", type=int, default=500)
    add("--eval_steps", type=int, default=500)
    add("--save_total_limit", type=int, default=1)

    parser.set_defaults(resume_from_latest=True)
    add("--resume_from_latest", dest="resume_from_latest", action="store_true")
    add("--no_resume_from_latest", dest="resume_from_latest", action="store_false")
    add("--resume_checkpoint", default=None)
    add("--save_only_model", action="store_true")
    add("--restore_callback_states_from_checkpoint", action="store_true")

    add("--fp16", action="store_true")
    add("--bf16", action="store_true")
    add("--gradient_checkpointing", action="store_true")
    add("--dataloader_num_workers", type=int, default=0)
    add("--no_pin_memory", action="store_true")
    add("--torch_empty_cache_steps", type=int, default=0)
    add("--device", default="auto", choices=["auto", "cuda", "cpu"])
    add("--min_free_gpu_mb", type=int, default=4096)
    add("--skip_gpu_preflight", action="store_true")

    add("--use_first_answer_only", action="store_true")
    add("--require_answer_in_context", action="store_true")
    return parser

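The `--resume_from_latest` / `--no_resume_from_latest` pair above is the standard argparse pattern for an on-by-default boolean: `set_defaults` establishes the default, and both flags write to the same `dest` with `store_true` / `store_false`. A minimal standalone sketch of just that pattern (the flag names mirror the diff; the surrounding parser is simplified):

```python
import argparse

# On-by-default boolean flag with an explicit opt-out, as in fine_tune_qg.py.
parser = argparse.ArgumentParser()
parser.set_defaults(resume_from_latest=True)
parser.add_argument("--resume_from_latest", dest="resume_from_latest", action="store_true")
parser.add_argument("--no_resume_from_latest", dest="resume_from_latest", action="store_false")

print(parser.parse_args([]).resume_from_latest)                           # True (default)
print(parser.parse_args(["--no_resume_from_latest"]).resume_from_latest)  # False (opt-out)
```

Both flags sharing one `dest` means later command-line occurrences simply overwrite earlier ones, so users can override the default either way.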
def load_datasets(args):
    load_kwargs = {
        "use_all_answers": not args.use_first_answer_only,
        "task_prefix": args.task_prefix,
        "require_answer_in_context": args.require_answer_in_context,
    }
    train_examples, train_stats = load_squad_qg_examples(args.train_file, **load_kwargs)
    if not train_examples:
        raise ValueError("Không có dữ liệu train hợp lệ sau khi tiền xử lý.")

    train_dataset = Dataset.from_list(train_examples)
    val_dataset = None
    val_stats = None

    if args.validation_file:
        val_examples, val_stats = load_squad_qg_examples(args.validation_file, **load_kwargs)
        if not val_examples:
            raise ValueError("Không có dữ liệu validation hợp lệ sau khi tiền xử lý.")
        val_dataset = Dataset.from_list(val_examples)
    elif args.val_ratio > 0 and len(train_dataset) > 10:
        split = train_dataset.train_test_split(test_size=args.val_ratio, seed=args.seed)
        train_dataset, val_dataset = split["train"], split["test"]

    return train_dataset, val_dataset, train_stats, val_stats

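When no validation file is supplied, `load_datasets` carves `val_ratio` out of the training set with `Dataset.train_test_split(..., seed=args.seed)` (from the 🤗 `datasets` library), so the split is reproducible across resumed runs. A stdlib-only sketch of the same idea — shuffle deterministically with a seeded RNG, then slice — to illustrate why fixing the seed matters:

```python
import random

def split_examples(examples, test_size=0.1, seed=42):
    # Shuffle a copy with a seeded RNG, then slice off the head as validation.
    rng = random.Random(seed)
    shuffled = examples[:]
    rng.shuffle(shuffled)
    n_test = max(1, int(len(shuffled) * test_size))
    return shuffled[n_test:], shuffled[:n_test]

data = list(range(100))
train_a, val_a = split_examples(data)
train_b, val_b = split_examples(data)
print(len(train_a), len(val_a))  # 90 10
print(val_a == val_b)            # True: same seed gives the same split
```

Without a fixed seed, a resumed or re-run job would validate on different examples, making eval-loss comparisons between runs meaningless.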
def tokenize_dataset(dataset, tokenizer, args):
    return dataset.map(
        lambda batch: preprocess_function(batch, tokenizer, args.max_source_length, args.max_target_length),
        batched=True,
        remove_columns=dataset.column_names,
    )

def build_trainer(model, tokenizer, training_args, train_dataset, eval_dataset, args):
    kwargs = {
        "model": model,
        "args": training_args,
        "data_collator": DataCollatorForSeq2Seq(tokenizer=tokenizer, model=model),
        "train_dataset": train_dataset,
        "eval_dataset": eval_dataset,
        "callbacks": [EarlyStoppingCallback(early_stopping_patience=args.early_stopping_patience)]
        if eval_dataset is not None
        else None,
        "processing_class": tokenizer,
    }
    return Seq2SeqTrainer(
        **build_supported_kwargs(Seq2SeqTrainer, kwargs, aliases={"processing_class": "tokenizer"})
    )

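`build_supported_kwargs` is defined elsewhere in the script and not shown in this hunk. From how it is called, it presumably filters the kwargs dict down to parameters the `Seq2SeqTrainer` constructor actually accepts, using the `aliases` map to bridge renames across `transformers` versions (newer releases use `processing_class` where older ones used `tokenizer`). A hypothetical stdlib-only sketch of that idea, with a toy class standing in for an older Trainer signature:

```python
import inspect

def build_supported_kwargs(cls, kwargs, aliases=None):
    """Keep only kwargs the constructor accepts; fall back to aliased names."""
    aliases = aliases or {}
    accepted = set(inspect.signature(cls.__init__).parameters)
    out = {}
    for name, value in kwargs.items():
        if name in accepted:
            out[name] = value
        elif aliases.get(name) in accepted:
            out[aliases[name]] = value  # e.g. processing_class -> tokenizer
        # anything the target class does not understand is silently dropped
    return out

class OldTrainer:  # toy stand-in for an older Trainer constructor
    def __init__(self, model=None, tokenizer=None):
        self.model, self.tokenizer = model, tokenizer

kw = build_supported_kwargs(
    OldTrainer,
    {"model": "m", "processing_class": "tok"},
    aliases={"processing_class": "tokenizer"},
)
print(kw)  # {'model': 'm', 'tokenizer': 'tok'}
```

Inspecting the signature at call time keeps one script compatible with several `transformers` releases without version pinning.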
def main() -> None:
    args = build_parser().parse_args()
    output_dir = Path(args.output_dir)
    output_dir.mkdir(parents=True, exist_ok=True)

    set_seed(args.seed)
    ensure_device_ready(args)

    raw_train_dataset, raw_val_dataset, train_stats, val_stats = load_datasets(args)
    has_eval = raw_val_dataset is not None
    validate_args(args, has_eval)

    tokenizer = AutoTokenizer.from_pretrained(args.model_name)
    model = AutoModelForSeq2SeqLM.from_pretrained(args.model_name)

    if args.gradient_checkpointing:
        model.gradient_checkpointing_enable()
        if hasattr(model.config, "use_cache"):
            model.config.use_cache = False

    tokenized_train = tokenize_dataset(raw_train_dataset, tokenizer, args)
    tokenized_val = tokenize_dataset(raw_val_dataset, tokenizer, args) if has_eval else None
    trainer = build_trainer(
        model,
        tokenizer,
        build_training_args(args, has_eval),
        tokenized_train,
        tokenized_val,
        args,
    )

    resume_checkpoint = resolve_resume_checkpoint(args)
    try:
        train_result = trainer.train(resume_from_checkpoint=resume_checkpoint)
    except torch.OutOfMemoryError:
        raise_cuda_oom(args)
    except RuntimeError as exc:
        if "CUDA out of memory" in str(exc):
            raise_cuda_oom(args)
        raise

    trainer.save_state()

    export_dir = output_dir / ("best-model" if has_eval else "final-model")
    export_dir.mkdir(parents=True, exist_ok=True)
    for path in (export_dir, output_dir):
        trainer.save_model(str(path))
        tokenizer.save_pretrained(str(path))

    train_metrics = train_result.metrics
    trainer.log_metrics("train", train_metrics)
    trainer.save_metrics("train", train_metrics)

    eval_metrics = None
    if has_eval:
        eval_metrics = trainer.evaluate(
            max_length=args.max_target_length,
            num_beams=4,
            metric_key_prefix="eval",
        )
        trainer.log_metrics("eval", eval_metrics)
        trainer.save_metrics("eval", eval_metrics)

    save_json(
        {
            "base_model": args.model_name,
            "task_prefix": args.task_prefix,
            "output_dir": str(output_dir),
            "export_dir": str(export_dir),
            "train_size": len(raw_train_dataset),
            "val_size": len(raw_val_dataset) if raw_val_dataset is not None else 0,
            "train_stats": train_stats,
            "val_stats": val_stats,
            "best_model_checkpoint": trainer.state.best_model_checkpoint,
            "best_metric": trainer.state.best_metric,
            "resumed_from_checkpoint": resume_checkpoint,
            "args": vars(args),
            "train_metrics": train_metrics,
            "eval_metrics": eval_metrics,
        },
        output_dir / "training_summary.json",
    )


if __name__ == "__main__":
    main()
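`save_json`, used above for `training_summary.json`, is defined earlier in `fine_tune_qg.py` and not shown in this hunk. Assuming it is a thin wrapper over `json.dump`, one important detail for this project is `ensure_ascii=False`, so Vietnamese text in `train_stats` and the args survives as readable UTF-8. A hypothetical sketch:

```python
import json
import tempfile
from pathlib import Path

def save_json(payload, path):
    # Pretty-print as UTF-8 so Vietnamese text is stored readably, not as \uXXXX escapes.
    Path(path).write_text(json.dumps(payload, ensure_ascii=False, indent=2), encoding="utf-8")

with tempfile.TemporaryDirectory() as tmp:
    out = Path(tmp) / "training_summary.json"
    save_json({"base_model": "t5-base", "train_size": 40000}, out)
    summary = json.loads(out.read_text(encoding="utf-8"))

print(summary["train_size"])  # 40000
```

Writing the summary next to the exported model makes a run self-describing: the args, split sizes, and best checkpoint can be recovered later without the original shell command.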
HVU_QA/frontend/app.js
ADDED
@@ -0,0 +1,1233 @@
const STORAGE_KEY = "hvu_qa_history_v1";
const QUESTION_COUNT_LIMITS = { min: 0, max: 100, default: 0 };
const SAMPLE_SNIPPETS = [
  {
    id: "luat-giao-duc-dai-hoc",
    title: "Luật Giáo dục đại học",
    preview: "Quy định về cơ sở giáo dục đại học, chương trình đào tạo và quyền tự chủ.",
    text: "Cơ sở giáo dục đại học có nhiệm vụ tổ chức hoạt động đào tạo, nghiên cứu khoa học, hợp tác quốc tế và phục vụ cộng đồng theo quy định của pháp luật. Cơ sở giáo dục đại học được thực hiện quyền tự chủ gắn với trách nhiệm giải trình, bảo đảm chất lượng đào tạo, công khai điều kiện bảo đảm chất lượng và kết quả hoạt động với cơ quan quản lý nhà nước, người học và xã hội.",
    suggestedCount: 5,
  },
  {
    id: "bo-luat-lao-dong",
    title: "Bộ luật Lao động",
    preview: "Nội dung về hợp đồng lao động, quyền và nghĩa vụ của người lao động.",
    text: "Hợp đồng lao động là sự thỏa thuận giữa người lao động và người sử dụng lao động về việc làm có trả công, tiền lương, điều kiện lao động, quyền và nghĩa vụ của mỗi bên. Khi giao kết hợp đồng lao động, các bên phải tuân thủ nguyên tắc tự nguyện, bình đẳng, thiện chí, hợp tác và trung thực; không được thỏa thuận nội dung làm giảm quyền lợi của người lao động so với quy định của pháp luật.",
    suggestedCount: 4,
  },
  {
    id: "luat-an-toan-thong-tin-mang",
    title: "Luật An toàn thông tin mạng",
    preview: "Yêu cầu bảo vệ thông tin, phòng ngừa rủi ro và trách nhiệm của tổ chức, cá nhân.",
    text: "Cơ quan, tổ chức, cá nhân tham gia hoạt động trên môi trường mạng có trách nhiệm bảo đảm an toàn thông tin mạng; áp dụng biện pháp quản lý, kỹ thuật phù hợp để phòng ngừa, phát hiện, ngăn chặn và xử lý nguy cơ mất an toàn thông tin. Việc thu thập, xử lý và sử dụng thông tin trên môi trường mạng phải đúng mục đích, đúng thẩm quyền và bảo đảm quyền, lợi ích hợp pháp của cơ quan, tổ chức, cá nhân có liên quan.",
    suggestedCount: 5,
  },
];
const AUTHOR_PROFILES = [
  {
    id: "do-cao-dang",
    name: "Đỗ Cao Đăng",
    role: "Sinh viên thực hiện",
    summary: "Thành viên nhóm thực hiện đề tài.",
    description: "Sinh viên tham gia triển khai và hoàn thiện hệ thống.",
    unit: "Khoa Kỹ thuật - Công nghệ",
    email: "docaodang532001@gmail.com",
  },
  {
    id: "hoang-tuan-ngoc",
    name: "Hoàng Tuấn Ngọc",
    role: "Sinh viên thực hiện",
    summary: "Thành viên nhóm thực hiện đề tài.",
    description: "Sinh viên đồng thực hiện và phối hợp phát triển nội dung cho hệ thống.",
    unit: "Khoa Kỹ thuật - Công nghệ",
    email: "hoangtuanngoc2005@gmail.com",
  },
  {
    id: "nguyen-tien-ha",
    name: "TS. Nguyễn Tiến Hà",
    role: "Giảng viên hướng dẫn",
    summary: "Giảng viên hướng dẫn chuyên môn cho đề tài.",
    description: "Giảng viên hướng dẫn học thuật và định hướng chuyên môn cho đề tài.",
    unit: "Khoa Kỹ thuật - Công nghệ",
    email: "nguyentienha@hvu.edu.vn",
  },
];

const state = {
  history: loadHistory(),
  selectedId: null,
  thread: [],
  availableModels: [],
  activeModelId: "",
  isSwitchingModel: false,
  selectedAuthorId: null,
  isAuthorPanelOpen: false,
};

const voiceState = {
  isSupported: false,
  isListening: false,
  recognition: null,
  discardOnStop: false,
  stopRequested: false,
  baseText: "",
  finalTranscript: "",
  interimTranscript: "",
  lastErrorMessage: "",
};

const elements = {
  menuToggle: document.getElementById("menuToggle"),
  sidebarContent: document.getElementById("sidebarContent"),
  modelSelect: document.getElementById("modelSelect"),
  deviceStatus: document.getElementById("deviceStatus"),
  modelStatus: document.getElementById("modelStatus"),
  historyList: document.getElementById("historyList"),
  clearHistory: document.getElementById("clearHistory"),
  resultCard: document.getElementById("resultCard"),
  form: document.getElementById("generatorForm"),
  sourceShell: document.getElementById("sourceShell"),
  sourceText: document.getElementById("sourceText"),
  voiceInputButton: document.getElementById("voiceInputButton"),
  voiceStatus: document.getElementById("voiceStatus"),
  questionCount: document.getElementById("questionCount"),
  questionCountValue: document.getElementById("questionCountValue"),
  decreaseCount: document.getElementById("decreaseCount"),
  increaseCount: document.getElementById("increaseCount"),
  generateButton: document.getElementById("generateButton"),
  authorToggle: document.getElementById("authorToggle"),
  authorContent: document.getElementById("authorContent"),
  authorList: document.getElementById("authorList"),
  copyrightLine: document.getElementById("copyrightLine"),
  heroTitle: document.getElementById("heroTitle"),
  landingPanel: document.getElementById("landingPanel"),
  sampleList: document.getElementById("sampleList"),
  landingRuntimeBadge: document.getElementById("landingRuntimeBadge"),
  landingStatusText: document.getElementById("landingStatusText"),
  landingModelName: document.getElementById("landingModelName"),
  landingDeviceName: document.getElementById("landingDeviceName"),
  landingModelCount: document.getElementById("landingModelCount"),
};

bootstrap();

function bootstrap() {
  hydrateQuestionCount();
  syncSidebar(false);
  renderHistory();
  renderAuthors();
  renderLandingSamples();
  attachEvents();
  hideResultCard();
  activateHeroTitle();
  syncCopyright();
  autoResizeTextarea();
  initVoiceInput();
  syncLandingPanel();
  fetchInfo();
}

function hydrateQuestionCount() {
  syncQuestionCount(elements.questionCount.value);
}

function attachEvents() {
  elements.menuToggle.addEventListener("click", () => {
    syncSidebar(!document.body.classList.contains("sidebar-open"));
  });

  document.addEventListener("keydown", (event) => {
    if (event.key === "Escape" && document.body.classList.contains("sidebar-open")) {
      syncSidebar(false);
    }
  });

  elements.clearHistory.addEventListener("click", () => {
    stopVoiceInput(true);
    state.history = [];
    state.selectedId = null;
    state.thread = [];
    persistHistory();
    renderHistory();
    hideResultCard();
  });

  elements.historyList.addEventListener("click", (event) => {
    const button = event.target.closest("[data-history-id]");
    if (!button) return;

    stopVoiceInput(true);

    const entry = state.history.find((item) => item.id === button.dataset.historyId);
    if (!entry) return;

    state.selectedId = entry.id;
    state.thread = [entry];
    elements.sourceText.value = entry.text || entry.title || "";
    autoResizeTextarea();
    renderHistory();
    renderThread();
  });

  elements.resultCard.addEventListener("click", async (event) => {
    const button = event.target.closest("[data-copy-entry-id][data-copy-target]");
    if (!button) return;

    const entryId = button.dataset.copyEntryId;
    const target = button.dataset.copyTarget;
    const textToCopy = buildCopyText(entryId, target);
    if (!textToCopy) return;

    const copied = await copyToClipboard(textToCopy);
    if (!copied) return;

    button.classList.add("is-copied");
    button.setAttribute("aria-label", "Đã sao chép");
    window.setTimeout(() => {
      button.classList.remove("is-copied");
      button.setAttribute("aria-label", "Sao chép");
    }, 1400);
  });

  elements.sourceText.addEventListener("input", () => {
    autoResizeTextarea();
  });

  elements.sampleList?.addEventListener("click", (event) => {
    const button = event.target.closest("[data-sample-id]");
    if (!button) return;

    const sample = SAMPLE_SNIPPETS.find((item) => item.id === button.dataset.sampleId);
    if (!sample) return;

    stopVoiceInput(true);
    elements.sourceText.value = sample.text;
    if (Number(elements.questionCount.value || QUESTION_COUNT_LIMITS.default) <= 0) {
      syncQuestionCount(sample.suggestedCount || 5);
    }
    autoResizeTextarea();
    elements.sourceText.focus();
  });

  elements.voiceInputButton.addEventListener("click", () => {
    toggleVoiceInput();
  });

  elements.modelSelect.addEventListener("change", () => {
    const nextModelId = String(elements.modelSelect.value || "").trim();
    if (!nextModelId || nextModelId === state.activeModelId || state.isSwitchingModel) {
      return;
    }
    switchModel(nextModelId);
  });

  elements.authorToggle?.addEventListener("click", () => {
    state.isAuthorPanelOpen = !state.isAuthorPanelOpen;
    renderAuthors();
  });

  elements.authorList?.addEventListener("click", (event) => {
    const button = event.target.closest("[data-author-id]");
    if (!button) return;
    selectAuthor(button.dataset.authorId);
  });

  elements.decreaseCount.addEventListener("click", () => {
    syncQuestionCount(Number(elements.questionCount.value || QUESTION_COUNT_LIMITS.default) - 1);
  });

  elements.increaseCount.addEventListener("click", () => {
    syncQuestionCount(Number(elements.questionCount.value || QUESTION_COUNT_LIMITS.default) + 1);
  });

  elements.form.addEventListener("submit", async (event) => {
    event.preventDefault();

    if (state.isSwitchingModel) {
      renderMessage("Vui lòng chờ chuyển model xong rồi thử lại.");
      return;
    }

    if (voiceState.isListening) {
      renderMessage("Vui lòng dừng micro trước khi sinh câu hỏi.");
      return;
    }

    const text = elements.sourceText.value.trim();
    const numQuestions = Number(elements.questionCount.value || String(QUESTION_COUNT_LIMITS.default));

    if (!text) {
      renderMessage("Vui lòng nhập đoạn văn bản trước khi sinh câu hỏi.");
      elements.sourceText.focus();
      return;
    }

    if (numQuestions <= 0) {
      renderMessage("Vui lòng tăng số câu hỏi lên ít nhất 1.");
      elements.increaseCount.focus();
      return;
    }

    const pendingEntry = {
      id: makeId(),
      title: shrink(text, 52),
      text,
      questions: [],
      elapsedMs: null,
      device: null,
      count: numQuestions,
      createdAt: new Date().toISOString(),
      status: "pending",
      errorMessage: "",
    };

    state.thread = [...state.thread, pendingEntry];
    renderThread();
    elements.sourceText.value = "";
    autoResizeTextarea();

    setLoading(true);

    try {
      const response = await fetch("/api/generate", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          model_id: state.activeModelId || undefined,
          text,
          num_questions: numQuestions,
        }),
      });
      const payload = await response.json();

      if (!response.ok || !payload.ok) {
        throw new Error(payload.error || "Không thể sinh câu hỏi lúc này.");
      }

      const entry = {
        id: pendingEntry.id,
        title: shrink(text, 52),
        text: payload.text,
        questions: payload.questions,
        elapsedMs: payload.elapsed_ms,
        device: payload.meta.active_device || payload.meta.predicted_device,
        count: payload.questions?.length || numQuestions,
        createdAt: pendingEntry.createdAt,
        status: "done",
        errorMessage: "",
      };

      state.selectedId = entry.id;
      state.history = [entry, ...state.history.filter((item) => item.questions?.length)].slice(0, 10);
      state.thread = state.thread.map((item) => (item.id === pendingEntry.id ? entry : item));
      persistHistory();
      renderHistory();
      renderThread();
    } catch (error) {
      state.thread = state.thread.map((item) =>
        item.id === pendingEntry.id
          ? {
              ...item,
              status: "error",
              errorMessage: error.message || "Có lỗi xảy ra khi sinh câu hỏi.",
            }
          : item,
      );
      renderThread();
    } finally {
      setLoading(false);
    }
  });
}

function syncQuestionCount(value) {
  const parsedValue = Number(value);
  const normalizedValue = Number.isFinite(parsedValue) ? Math.trunc(parsedValue) : QUESTION_COUNT_LIMITS.default;
  const safeValue = Math.min(
    QUESTION_COUNT_LIMITS.max,
    Math.max(QUESTION_COUNT_LIMITS.min, normalizedValue),
  );
  const isLoading = isInterfaceBusy();

  elements.questionCount.value = String(safeValue);
  elements.questionCountValue.textContent = String(safeValue);
  elements.decreaseCount.disabled = isLoading || safeValue <= QUESTION_COUNT_LIMITS.min;
  elements.increaseCount.disabled = isLoading || safeValue >= QUESTION_COUNT_LIMITS.max;
}

function syncSidebar(isOpen) {
  document.body.classList.toggle("sidebar-open", isOpen);
  elements.menuToggle.setAttribute("aria-expanded", String(isOpen));
  elements.sidebarContent.setAttribute("aria-hidden", String(!isOpen));
}

function initVoiceInput() {
  const SpeechRecognitionApi = window.SpeechRecognition || window.webkitSpeechRecognition;

  if (!SpeechRecognitionApi) {
    voiceState.isSupported = false;
    syncVoiceUi(buildVoiceUnsupportedMessage(), "error");
    return;
  }

  if (!canUseBrowserVoiceInput()) {
    voiceState.isSupported = false;
    syncVoiceUi(buildVoiceOriginMessage(), "error");
    return;
  }

  voiceState.recognition = createSpeechRecognition(SpeechRecognitionApi);
  voiceState.isSupported = true;
  syncVoiceUi("");
}

function toggleVoiceInput() {
  if (!voiceState.isSupported) {
    syncVoiceUi(
      canUseBrowserVoiceInput() ? buildVoiceUnsupportedMessage() : buildVoiceOriginMessage(),
      "error",
    );
    return;
  }

  if (voiceState.isListening) {
    stopVoiceInput(false);
    return;
  }

  startVoiceInput();
}

function startVoiceInput() {
  if (!voiceState.recognition) {
    syncVoiceUi(buildVoiceUnsupportedMessage(), "error");
    return;
  }

  try {
    voiceState.baseText = elements.sourceText.value.trimEnd();
    voiceState.finalTranscript = "";
    voiceState.interimTranscript = "";
    voiceState.discardOnStop = false;
    voiceState.stopRequested = false;
    voiceState.lastErrorMessage = "";
    syncVoiceUi("Đang bật nhận giọng nói...");
    voiceState.recognition.start();
  } catch (error) {
    voiceState.discardOnStop = false;
    syncVoiceUi(humanizeVoiceStartError(error), "error");
  }
}

function stopVoiceInput(discardRecording = false) {
  if (!voiceState.isListening || !voiceState.recognition) {
    return;
  }

  voiceState.discardOnStop = Boolean(discardRecording);
  voiceState.stopRequested = true;

  if (discardRecording) {
    voiceState.finalTranscript = "";
    voiceState.interimTranscript = "";
    elements.sourceText.value = voiceState.baseText;
    autoResizeTextarea();
  }

  syncVoiceUi(discardRecording ? "Đang hủy nhận giọng nói..." : "Đang dừng micro...");
  voiceState.recognition.stop();
}

function createSpeechRecognition(SpeechRecognitionApi) {
  const recognition = new SpeechRecognitionApi();
  recognition.lang = "vi-VN";
  recognition.continuous = true;
  recognition.interimResults = true;
  recognition.maxAlternatives = 1;

  recognition.onstart = () => {
    voiceState.isListening = true;
    voiceState.stopRequested = false;
    voiceState.lastErrorMessage = "";
    syncVoiceUi("Đang nghe... hãy nói vào micro.");
  };

  recognition.onresult = (event) => {
    if (voiceState.discardOnStop) {
      return;
    }

    let nextFinalTranscript = "";
    let nextInterimTranscript = "";

    for (let index = event.resultIndex; index < event.results.length; index += 1) {
      const transcript = String(event.results[index][0]?.transcript || "").trim();
      if (!transcript) continue;

      if (event.results[index].isFinal) {
        nextFinalTranscript = appendSpeechChunk(nextFinalTranscript, transcript);
      } else {
        nextInterimTranscript = appendSpeechChunk(nextInterimTranscript, transcript);
      }
    }

    if (nextFinalTranscript) {
      voiceState.finalTranscript = appendSpeechChunk(voiceState.finalTranscript, nextFinalTranscript);
    }

    voiceState.interimTranscript = nextInterimTranscript;
    syncSpeechDraft();
    syncVoiceUi("Đang nghe... bấm lại nếu muốn dừng.");
  };

  recognition.onerror = (event) => {
    console.warn("SpeechRecognition error:", event.error);

    if (event.error === "aborted" && (voiceState.discardOnStop || voiceState.stopRequested)) {
      voiceState.lastErrorMessage = "";
      return;
    }

    voiceState.lastErrorMessage = humanizeVoiceRecognitionError(event.error);
  };

  recognition.onend = () => {
    const shouldDiscard = voiceState.discardOnStop;
|
| 496 |
+
const recognizedText = appendSpeechChunk(voiceState.finalTranscript, voiceState.interimTranscript);
|
| 497 |
+
const hasRecognizedText = Boolean(recognizedText.trim());
|
| 498 |
+
|
| 499 |
+
voiceState.isListening = false;
|
| 500 |
+
|
| 501 |
+
if (!shouldDiscard && hasRecognizedText) {
|
| 502 |
+
elements.sourceText.value = appendSpeechChunk(voiceState.baseText, recognizedText);
|
| 503 |
+
autoResizeTextarea();
|
| 504 |
+
}
|
| 505 |
+
|
| 506 |
+
let finalMessage = "";
|
| 507 |
+
let finalTone = "default";
|
| 508 |
+
|
| 509 |
+
if (voiceState.lastErrorMessage) {
|
| 510 |
+
finalMessage = voiceState.lastErrorMessage;
|
| 511 |
+
finalTone = "error";
|
| 512 |
+
} else if (!shouldDiscard && hasRecognizedText) {
|
| 513 |
+
finalMessage = "Đã chèn nội dung giọng nói vào ô nhập.";
|
| 514 |
+
} else if (!shouldDiscard && !voiceState.stopRequested) {
|
| 515 |
+
finalMessage = "Không nhận diện được nội dung giọng nói. Hãy thử lại.";
|
| 516 |
+
finalTone = "error";
|
| 517 |
+
}
|
| 518 |
+
|
| 519 |
+
voiceState.discardOnStop = false;
|
| 520 |
+
voiceState.stopRequested = false;
|
| 521 |
+
voiceState.finalTranscript = "";
|
| 522 |
+
voiceState.interimTranscript = "";
|
| 523 |
+
voiceState.lastErrorMessage = "";
|
| 524 |
+
voiceState.baseText = elements.sourceText.value.trimEnd();
|
| 525 |
+
|
| 526 |
+
syncVoiceUi(finalMessage, finalTone);
|
| 527 |
+
};
|
| 528 |
+
|
| 529 |
+
return recognition;
|
| 530 |
+
}
|
| 531 |
+
|
| 532 |
+
function appendSpeechChunk(base, chunk) {
|
| 533 |
+
const normalizedBase = String(base || "");
|
| 534 |
+
const normalizedChunk = String(chunk || "").trim();
|
| 535 |
+
|
| 536 |
+
if (!normalizedChunk) return normalizedBase;
|
| 537 |
+
if (!normalizedBase.trim()) return normalizedChunk;
|
| 538 |
+
|
| 539 |
+
return /[\s(]$/.test(normalizedBase) ? `${normalizedBase}${normalizedChunk}` : `${normalizedBase} ${normalizedChunk}`;
|
| 540 |
+
}
|
| 541 |
+
|
| 542 |
+
function humanizeVoiceStartError(error) {
|
| 543 |
+
const errorName = error?.name || "";
|
| 544 |
+
|
| 545 |
+
if (errorName === "InvalidStateError") {
|
| 546 |
+
return "Micro đang hoạt động. Hãy dừng phiên hiện tại trước khi bật lại.";
|
| 547 |
+
}
|
| 548 |
+
|
| 549 |
+
if (errorName === "NotAllowedError" || errorName === "SecurityError") {
|
| 550 |
+
if (!canUseBrowserVoiceInput()) {
|
| 551 |
+
return buildVoiceOriginMessage();
|
| 552 |
+
}
|
| 553 |
+
return "Bạn chưa cấp quyền micro cho trình duyệt.";
|
| 554 |
+
}
|
| 555 |
+
|
| 556 |
+
if (errorName === "NotFoundError" || errorName === "DevicesNotFoundError") {
|
| 557 |
+
return "Không tìm thấy thiết bị micro.";
|
| 558 |
+
}
|
| 559 |
+
|
| 560 |
+
return "Không thể bật nhận giọng nói lúc này. Hãy thử lại.";
|
| 561 |
+
}
|
| 562 |
+
|
| 563 |
+
function humanizeVoiceRecognitionError(errorCode) {
|
| 564 |
+
if (errorCode === "not-allowed" || errorCode === "service-not-allowed") {
|
| 565 |
+
if (!canUseBrowserVoiceInput()) {
|
| 566 |
+
return buildVoiceOriginMessage();
|
| 567 |
+
}
|
| 568 |
+
return "Bạn chưa cấp quyền micro hoặc nhận giọng nói cho trình duyệt.";
|
| 569 |
+
}
|
| 570 |
+
|
| 571 |
+
if (errorCode === "audio-capture") {
|
| 572 |
+
return "Không tìm thấy micro hoặc micro đang bị chiếm dụng.";
|
| 573 |
+
}
|
| 574 |
+
|
| 575 |
+
if (errorCode === "network") {
|
| 576 |
+
if (!navigator.onLine) {
|
| 577 |
+
return "Thiết bị đang offline. Hãy kết nối Internet rồi thử lại.";
|
| 578 |
+
}
|
| 579 |
+
|
| 580 |
+
if (!canUseBrowserVoiceInput()) {
|
| 581 |
+
return buildVoiceOriginMessage();
|
| 582 |
+
}
|
| 583 |
+
|
| 584 |
+
return "Trình duyệt không kết nối được dịch vụ nhận giọng nói. Hãy dùng Chrome hoặc Edge, kiểm tra mạng, VPN hay firewall rồi thử lại.";
|
| 585 |
+
}
|
| 586 |
+
|
| 587 |
+
if (errorCode === "language-not-supported") {
|
| 588 |
+
return "Trình duyệt không hỗ trợ nhận giọng nói tiếng Việt.";
|
| 589 |
+
}
|
| 590 |
+
|
| 591 |
+
if (errorCode === "no-speech") {
|
| 592 |
+
return "Không nghe thấy giọng nói. Hãy thử nói gần micro hơn.";
|
| 593 |
+
}
|
| 594 |
+
|
| 595 |
+
return "Không thể nhận giọng nói lúc này. Hãy thử lại.";
|
| 596 |
+
}
|
| 597 |
+
|
| 598 |
+
function canUseBrowserVoiceInput() {
|
| 599 |
+
return window.isSecureContext || isLocalhost(window.location.hostname);
|
| 600 |
+
}
|
| 601 |
+
|
| 602 |
+
function isLocalhost(hostname) {
|
| 603 |
+
return (
|
| 604 |
+
hostname === "localhost" ||
|
| 605 |
+
hostname.endsWith(".localhost") ||
|
| 606 |
+
hostname === "127.0.0.1" ||
|
| 607 |
+
hostname === "::1" ||
|
| 608 |
+
hostname === "[::1]"
|
| 609 |
+
);
|
| 610 |
+
}
|
| 611 |
+
|
| 612 |
+
function buildVoiceUnsupportedMessage() {
|
| 613 |
+
return "Trình duyệt này chưa hỗ trợ nhập giọng nói trực tiếp. Hãy dùng Chrome hoặc Edge bản mới.";
|
| 614 |
+
}
|
| 615 |
+
|
| 616 |
+
function buildVoiceOriginMessage() {
|
| 617 |
+
return "Nhập bằng giọng nói qua trình duyệt chỉ hoạt động trên HTTPS hoặc localhost. Hãy mở ứng dụng bằng https:// hoặc http://localhost.";
|
| 618 |
+
}
|
| 619 |
+
|
| 620 |
+
function syncSpeechDraft() {
|
| 621 |
+
const stableText = appendSpeechChunk(voiceState.baseText, voiceState.finalTranscript);
|
| 622 |
+
elements.sourceText.value = appendSpeechChunk(stableText, voiceState.interimTranscript);
|
| 623 |
+
autoResizeTextarea();
|
| 624 |
+
}
|
| 625 |
+
|
| 626 |
+
function syncVoiceUi(message, tone = "default") {
|
| 627 |
+
const resolvedMessage = String(message || "").trim();
|
| 628 |
+
|
| 629 |
+
elements.voiceStatus.textContent = resolvedMessage;
|
| 630 |
+
elements.voiceStatus.classList.toggle("is-empty", !resolvedMessage);
|
| 631 |
+
elements.voiceStatus.classList.toggle("is-error", tone === "error");
|
| 632 |
+
elements.voiceStatus.classList.toggle("is-active", voiceState.isListening);
|
| 633 |
+
elements.voiceInputButton.disabled = !voiceState.isSupported;
|
| 634 |
+
elements.voiceInputButton.classList.toggle("is-listening", voiceState.isListening);
|
| 635 |
+
elements.voiceInputButton.classList.toggle("is-unsupported", !voiceState.isSupported);
|
| 636 |
+
elements.voiceInputButton.setAttribute(
|
| 637 |
+
"aria-label",
|
| 638 |
+
voiceState.isListening ? "Dừng nhận giọng nói" : "Nhập bằng giọng nói qua trình duyệt",
|
| 639 |
+
);
|
| 640 |
+
}
|
| 641 |
+
|
| 642 |
+
function autoResizeTextarea() {
|
| 643 |
+
const collapsedHeight = 30;
|
| 644 |
+
const expandedMinHeight = 86;
|
| 645 |
+
const hasContent = elements.sourceText.value.trim().length > 0;
|
| 646 |
+
|
| 647 |
+
elements.sourceText.style.height = `${collapsedHeight}px`;
|
| 648 |
+
const nextHeight = Math.min(elements.sourceText.scrollHeight, 240);
|
| 649 |
+
elements.sourceText.style.height = `${Math.max(nextHeight, hasContent ? expandedMinHeight : collapsedHeight)}px`;
|
| 650 |
+
elements.sourceShell.classList.toggle("is-expanded", hasContent || nextHeight > collapsedHeight + 4);
|
| 651 |
+
}
|
| 652 |
+
|
| 653 |
+
function loadHistory() {
|
| 654 |
+
try {
|
| 655 |
+
const saved = window.localStorage.getItem(STORAGE_KEY);
|
| 656 |
+
if (!saved) {
|
| 657 |
+
return [];
|
| 658 |
+
}
|
| 659 |
+
|
| 660 |
+
const parsed = JSON.parse(saved);
|
| 661 |
+
if (!Array.isArray(parsed)) return [];
|
| 662 |
+
|
| 663 |
+
return parsed
|
| 664 |
+
.map((item) => ({
|
| 665 |
+
...item,
|
| 666 |
+
id: item.id || makeId(),
|
| 667 |
+
createdAt: item.createdAt || new Date().toISOString(),
|
| 668 |
+
}))
|
| 669 |
+
.filter((item) => Array.isArray(item.questions) && item.questions.length > 0);
|
| 670 |
+
} catch {
|
| 671 |
+
return [];
|
| 672 |
+
}
|
| 673 |
+
}
|
| 674 |
+
|
| 675 |
+
function persistHistory() {
|
| 676 |
+
window.localStorage.setItem(STORAGE_KEY, JSON.stringify(state.history));
|
| 677 |
+
}
|
| 678 |
+
|
| 679 |
+
async function fetchInfo() {
|
| 680 |
+
try {
|
| 681 |
+
const response = await fetch("/api/info");
|
| 682 |
+
const payload = await response.json();
|
| 683 |
+
|
| 684 |
+
if (!response.ok || !payload.ok) {
|
| 685 |
+
throw new Error(payload.error || "Không đọc được thông tin hệ thống.");
|
| 686 |
+
}
|
| 687 |
+
|
| 688 |
+
applySystemInfo(payload);
|
| 689 |
+
} catch (error) {
|
| 690 |
+
state.availableModels = [];
|
| 691 |
+
state.activeModelId = "";
|
| 692 |
+
renderModelOptions("Không tải được danh sách model.");
|
| 693 |
+
elements.deviceStatus.textContent = "Không kết nối được backend.";
|
| 694 |
+
elements.modelStatus.textContent = error.message || "Vui lòng kiểm tra lại backend hoặc server Flask.";
|
| 695 |
+
syncLandingStatus({
|
| 696 |
+
modelName: "Không tải được danh sách model",
|
| 697 |
+
deviceName: "Chưa kết nối backend",
|
| 698 |
+
modelCount: 0,
|
| 699 |
+
badgeText: "Lỗi kết nối",
|
| 700 |
+
badgeTone: "error",
|
| 701 |
+
statusText: error.message || "Không thể đọc thông tin hệ thống từ backend.",
|
| 702 |
+
});
|
| 703 |
+
}
|
| 704 |
+
}
|
| 705 |
+
|
| 706 |
+
async function switchModel(modelId) {
|
| 707 |
+
const previousModelId = state.activeModelId;
|
| 708 |
+
|
| 709 |
+
state.isSwitchingModel = true;
|
| 710 |
+
syncInteractiveControls();
|
| 711 |
+
elements.modelStatus.textContent = "Đang chuyển model...";
|
| 712 |
+
syncLandingStatus({
|
| 713 |
+
modelName: state.availableModels.find((item) => item.id === modelId)?.label || "Đang chuyển model...",
|
| 714 |
+
deviceName: elements.deviceStatus.textContent || "Đang kiểm tra...",
|
| 715 |
+
modelCount: state.availableModels.length,
|
| 716 |
+
badgeText: "Đang chuyển",
|
| 717 |
+
badgeTone: "pending",
|
| 718 |
+
statusText: "Hệ thống đang chuyển model theo lựa chọn của bạn.",
|
| 719 |
+
});
|
| 720 |
+
|
| 721 |
+
try {
|
| 722 |
+
const response = await fetch("/api/model", {
|
| 723 |
+
method: "POST",
|
| 724 |
+
headers: { "Content-Type": "application/json" },
|
| 725 |
+
body: JSON.stringify({ model_id: modelId }),
|
| 726 |
+
});
|
| 727 |
+
const payload = await response.json();
|
| 728 |
+
|
| 729 |
+
if (!response.ok || !payload.ok) {
|
| 730 |
+
throw new Error(payload.error || "Không thể chuyển model lúc này.");
|
| 731 |
+
}
|
| 732 |
+
|
| 733 |
+
applySystemInfo(payload);
|
| 734 |
+
} catch (error) {
|
| 735 |
+
state.activeModelId = previousModelId;
|
| 736 |
+
renderModelOptions();
|
| 737 |
+
elements.modelStatus.textContent = error.message || "Không thể chuyển model lúc này.";
|
| 738 |
+
syncLandingStatus({
|
| 739 |
+
modelName:
|
| 740 |
+
state.availableModels.find((item) => item.id === previousModelId)?.label
|
| 741 |
+
|| state.availableModels[0]?.label
|
| 742 |
+
|| "Không xác định",
|
| 743 |
+
deviceName: elements.deviceStatus.textContent || "Đang kiểm tra...",
|
| 744 |
+
modelCount: state.availableModels.length,
|
| 745 |
+
badgeText: "Lỗi chuyển model",
|
| 746 |
+
badgeTone: "error",
|
| 747 |
+
statusText: error.message || "Không thể chuyển model lúc này.",
|
| 748 |
+
});
|
| 749 |
+
} finally {
|
| 750 |
+
state.isSwitchingModel = false;
|
| 751 |
+
syncInteractiveControls();
|
| 752 |
+
}
|
| 753 |
+
}
|
| 754 |
+
|
| 755 |
+
function applySystemInfo(payload) {
|
| 756 |
+
const availableModels = Array.isArray(payload.available_models) ? payload.available_models : [];
|
| 757 |
+
const fallbackModelId = String(payload.selected_model_id || payload.model_name || "default-model");
|
| 758 |
+
const fallbackModelLabel = String(payload.model_name || "Model hiện tại");
|
| 759 |
+
|
| 760 |
+
state.availableModels = availableModels.length
|
| 761 |
+
? availableModels
|
| 762 |
+
: [{ id: fallbackModelId, label: fallbackModelLabel }];
|
| 763 |
+
state.activeModelId = String(payload.selected_model_id || state.availableModels[0]?.id || "");
|
| 764 |
+
renderModelOptions();
|
| 765 |
+
|
| 766 |
+
const activeModel =
|
| 767 |
+
state.availableModels.find((item) => item.id === state.activeModelId)?.label || payload.model_name || "Model";
|
| 768 |
+
|
| 769 |
+
elements.deviceStatus.textContent = humanizeDevice(payload.meta.active_device || payload.meta.predicted_device);
|
| 770 |
+
elements.modelStatus.textContent = payload.meta.loaded
|
| 771 |
+
? `${activeModel} đã sẵn sàng cho tác vụ sinh câu hỏi.`
|
| 772 |
+
: `Đã chọn ${activeModel}. Model sẽ được nạp tự động ở lần sinh câu hỏi đầu tiên.`;
|
| 773 |
+
|
| 774 |
+
syncLandingStatus({
|
| 775 |
+
modelName: activeModel,
|
| 776 |
+
deviceName: humanizeDeviceCompact(payload.meta.active_device || payload.meta.predicted_device),
|
| 777 |
+
modelCount: state.availableModels.length,
|
| 778 |
+
badgeText: payload.meta.loaded ? "Sẵn sàng" : "Chờ nạp model",
|
| 779 |
+
badgeTone: payload.meta.loaded ? "ready" : "pending",
|
| 780 |
+
statusText: payload.meta.loaded
|
| 781 |
+
? `${activeModel} đã nạp xong và có thể sử dụng ngay.`
|
| 782 |
+
: `${activeModel} sẽ được nạp tự động ở lần sinh câu hỏi đầu tiên.`,
|
| 783 |
+
});
|
| 784 |
+
}
|
| 785 |
+
|
| 786 |
+
function renderModelOptions(emptyLabel = "Chưa có model khả dụng.") {
|
| 787 |
+
elements.modelSelect.innerHTML = "";
|
| 788 |
+
|
| 789 |
+
if (!state.availableModels.length) {
|
| 790 |
+
const fallbackOption = document.createElement("option");
|
| 791 |
+
fallbackOption.value = "";
|
| 792 |
+
fallbackOption.textContent = emptyLabel;
|
| 793 |
+
elements.modelSelect.appendChild(fallbackOption);
|
| 794 |
+
elements.modelSelect.disabled = true;
|
| 795 |
+
return;
|
| 796 |
+
}
|
| 797 |
+
|
| 798 |
+
for (const model of state.availableModels) {
|
| 799 |
+
const option = document.createElement("option");
|
| 800 |
+
option.value = model.id;
|
| 801 |
+
option.textContent = model.label;
|
| 802 |
+
elements.modelSelect.appendChild(option);
|
| 803 |
+
}
|
| 804 |
+
|
| 805 |
+
if (!state.availableModels.some((item) => item.id === state.activeModelId)) {
|
| 806 |
+
state.activeModelId = state.availableModels[0].id;
|
| 807 |
+
}
|
| 808 |
+
|
| 809 |
+
elements.modelSelect.value = state.activeModelId;
|
| 810 |
+
syncInteractiveControls();
|
| 811 |
+
}
|
| 812 |
+
|
| 813 |
+
function renderAuthors() {
|
| 814 |
+
if (!elements.authorToggle || !elements.authorContent || !elements.authorList) return;
|
| 815 |
+
|
| 816 |
+
elements.authorToggle.setAttribute("aria-expanded", String(state.isAuthorPanelOpen));
|
| 817 |
+
elements.authorToggle.classList.toggle("is-open", state.isAuthorPanelOpen);
|
| 818 |
+
elements.authorContent.hidden = !state.isAuthorPanelOpen;
|
| 819 |
+
|
| 820 |
+
if (!state.isAuthorPanelOpen) {
|
| 821 |
+
elements.authorList.innerHTML = "";
|
| 822 |
+
return;
|
| 823 |
+
}
|
| 824 |
+
|
| 825 |
+
elements.authorList.innerHTML = AUTHOR_PROFILES.map((author) => {
|
| 826 |
+
const isActive = author.id === state.selectedAuthorId;
|
| 827 |
+
const detailMarkup = isActive ? buildAuthorDetailMarkup(author) : "";
|
| 828 |
+
return `
|
| 829 |
+
<article
|
| 830 |
+
class="author-person ${isActive ? "is-active" : ""}"
|
| 831 |
+
data-author-shell="${escapeHtml(author.id)}"
|
| 832 |
+
>
|
| 833 |
+
<button
|
| 834 |
+
class="author-person-trigger"
|
| 835 |
+
type="button"
|
| 836 |
+
data-author-id="${escapeHtml(author.id)}"
|
| 837 |
+
aria-expanded="${String(isActive)}"
|
| 838 |
+
>
|
| 839 |
+
<div class="author-person-top">
|
| 840 |
+
<span class="author-person-role">${escapeHtml(author.role)}</span>
|
| 841 |
+
<span class="author-person-chevron" aria-hidden="true">
|
| 842 |
+
<svg viewBox="0 0 24 24" fill="none">
|
| 843 |
+
<path
|
| 844 |
+
d="m8 10 4 4 4-4"
|
| 845 |
+
stroke="currentColor"
|
| 846 |
+
stroke-linecap="round"
|
| 847 |
+
stroke-linejoin="round"
|
| 848 |
+
stroke-width="1.8"
|
| 849 |
+
/>
|
| 850 |
+
</svg>
|
| 851 |
+
</span>
|
| 852 |
+
</div>
|
| 853 |
+
<strong>${escapeHtml(author.name)}</strong>
|
| 854 |
+
<span class="author-person-summary">${escapeHtml(author.summary)}</span>
|
| 855 |
+
</button>
|
| 856 |
+
${detailMarkup}
|
| 857 |
+
</article>
|
| 858 |
+
`;
|
| 859 |
+
}).join("");
|
| 860 |
+
}
|
| 861 |
+
|
| 862 |
+
function renderLandingSamples() {
|
| 863 |
+
if (!elements.sampleList) return;
|
| 864 |
+
|
| 865 |
+
elements.sampleList.innerHTML = SAMPLE_SNIPPETS.map((sample) => `
|
| 866 |
+
<button class="sample-card" type="button" data-sample-id="${escapeHtml(sample.id)}">
|
| 867 |
+
<strong>${escapeHtml(sample.title)}</strong>
|
| 868 |
+
<span>${escapeHtml(sample.preview)}</span>
|
| 869 |
+
</button>
|
| 870 |
+
`).join("");
|
| 871 |
+
}
|
| 872 |
+
|
| 873 |
+
function syncLandingPanel() {
|
| 874 |
+
if (!elements.landingPanel) return;
|
| 875 |
+
elements.landingPanel.hidden = !elements.resultCard.hidden;
|
| 876 |
+
}
|
| 877 |
+
|
| 878 |
+
function syncLandingStatus({
|
| 879 |
+
modelName = "Đang tải...",
|
| 880 |
+
deviceName = "Đang kiểm tra...",
|
| 881 |
+
modelCount = 0,
|
| 882 |
+
badgeText = "Đang kiểm tra",
|
| 883 |
+
badgeTone = "pending",
|
| 884 |
+
statusText = "Đang đồng bộ trạng thái hệ thống.",
|
| 885 |
+
} = {}) {
|
| 886 |
+
if (
|
| 887 |
+
!elements.landingRuntimeBadge
|
| 888 |
+
|| !elements.landingStatusText
|
| 889 |
+
|| !elements.landingModelName
|
| 890 |
+
|| !elements.landingDeviceName
|
| 891 |
+
|| !elements.landingModelCount
|
| 892 |
+
) {
|
| 893 |
+
return;
|
| 894 |
+
}
|
| 895 |
+
|
| 896 |
+
elements.landingRuntimeBadge.textContent = badgeText;
|
| 897 |
+
elements.landingRuntimeBadge.classList.remove("is-ready", "is-pending", "is-error");
|
| 898 |
+
elements.landingRuntimeBadge.classList.add(
|
| 899 |
+
badgeTone === "ready" ? "is-ready" : badgeTone === "error" ? "is-error" : "is-pending",
|
| 900 |
+
);
|
| 901 |
+
elements.landingStatusText.textContent = statusText;
|
| 902 |
+
elements.landingModelName.textContent = modelName;
|
| 903 |
+
elements.landingDeviceName.textContent = deviceName;
|
| 904 |
+
elements.landingModelCount.textContent = String(modelCount);
|
| 905 |
+
}
|
| 906 |
+
|
| 907 |
+
function selectAuthor(authorId) {
|
| 908 |
+
if (!AUTHOR_PROFILES.some((author) => author.id === authorId)) {
|
| 909 |
+
return;
|
| 910 |
+
}
|
| 911 |
+
|
| 912 |
+
state.selectedAuthorId = state.selectedAuthorId === authorId ? null : authorId;
|
| 913 |
+
renderAuthors();
|
| 914 |
+
}
|
| 915 |
+
|
| 916 |
+
function buildAuthorDetailMarkup(author) {
|
| 917 |
+
const metaItems = [
|
| 918 |
+
{ label: "Vai trò", value: author.description },
|
| 919 |
+
{ label: "Đơn vị", value: author.unit },
|
| 920 |
+
...(author.email ? [{ label: "Email", value: author.email }] : []),
|
| 921 |
+
];
|
| 922 |
+
|
| 923 |
+
return `
|
| 924 |
+
<div class="author-person-body">
|
| 925 |
+
<div class="author-person-meta">
|
| 926 |
+
${metaItems
|
| 927 |
+
.map(
|
| 928 |
+
(item) => `
|
| 929 |
+
<div class="author-person-meta-row" title="${escapeHtml(item.value)}">
|
| 930 |
+
<span class="author-person-meta-label">${escapeHtml(item.label)}</span>
|
| 931 |
+
<span class="author-person-meta-value">${escapeHtml(item.value)}</span>
|
| 932 |
+
</div>
|
| 933 |
+
`,
|
| 934 |
+
)
|
| 935 |
+
.join("")}
|
| 936 |
+
</div>
|
| 937 |
+
</div>
|
| 938 |
+
`;
|
| 939 |
+
}
|
| 940 |
+
|
| 941 |
+
function humanizeDevice(device) {
|
| 942 |
+
if (device === "cuda") return "Đang sử dụng GPU CUDA.";
|
| 943 |
+
return "Đang sử dụng CPU.";
|
| 944 |
+
}
|
| 945 |
+
|
| 946 |
+
function humanizeDeviceCompact(device) {
|
| 947 |
+
if (device === "cuda") return "GPU CUDA";
|
| 948 |
+
return "CPU";
|
| 949 |
+
}
|
| 950 |
+
|
| 951 |
+
function renderHistory() {
|
| 952 |
+
if (!state.history.length) {
|
| 953 |
+
elements.historyList.innerHTML = '<div class="history-empty">Chưa có lịch sử. Hãy tạo bộ câu hỏi đầu tiên của bạn.</div>';
|
| 954 |
+
return;
|
| 955 |
+
}
|
| 956 |
+
|
| 957 |
+
elements.historyList.innerHTML = state.history
|
| 958 |
+
.map((item) => {
|
| 959 |
+
const activeClass = item.id === state.selectedId ? "is-active" : "";
|
| 960 |
+
return `
|
| 961 |
+
<button class="history-item ${activeClass}" type="button" data-history-id="${item.id}">
|
| 962 |
+
<span class="history-icon" aria-hidden="true">
|
| 963 |
+
<svg viewBox="0 0 24 24" width="18" height="18" fill="none">
|
| 964 |
+
<path d="M12 20c4.4 0 8-2.9 8-6.5S16.4 7 12 7 4 9.9 4 13.5c0 1.6.7 3 1.9 4.1L5 21l3.2-1.6c1.1.4 2.4.6 3.8.6Z" stroke="currentColor" stroke-width="1.6" stroke-linejoin="round"/>
|
| 965 |
+
</svg>
|
| 966 |
+
</span>
|
| 967 |
+
<span class="history-main">
|
| 968 |
+
<strong>${escapeHtml(item.title || "Đoạn văn mới")}</strong>
|
| 969 |
+
<span>${escapeHtml(formatTimestamp(item.createdAt))}</span>
|
| 970 |
+
</span>
|
| 971 |
+
</button>
|
| 972 |
+
`;
|
| 973 |
+
})
|
| 974 |
+
.join("");
|
| 975 |
+
}
|
| 976 |
+
|
| 977 |
+
function showResultCard() {
|
| 978 |
+
elements.resultCard.hidden = false;
|
| 979 |
+
elements.resultCard.classList.add("is-visible");
|
| 980 |
+
syncLandingPanel();
|
| 981 |
+
}
|
| 982 |
+
|
| 983 |
+
function hideResultCard() {
|
| 984 |
+
elements.resultCard.hidden = true;
|
| 985 |
+
elements.resultCard.classList.remove("is-visible");
|
| 986 |
+
elements.resultCard.classList.remove("has-entry");
|
| 987 |
+
elements.resultCard.classList.remove("is-updating");
|
| 988 |
+
elements.resultCard.innerHTML = "";
|
| 989 |
+
syncLandingPanel();
|
| 990 |
+
}
|
| 991 |
+
|
| 992 |
+
function renderMessage(message) {
|
| 993 |
+
showResultCard();
|
| 994 |
+
elements.resultCard.classList.remove("is-updating");
|
| 995 |
+
|
| 996 |
+
if (state.thread.length) {
|
| 997 |
+
elements.resultCard.classList.add("has-entry");
|
| 998 |
+
elements.resultCard.innerHTML = `
|
| 999 |
+
<p class="result-message result-message-inline">${escapeHtml(message)}</p>
|
| 1000 |
+
${renderThreadMarkup(state.thread)}
|
| 1001 |
+
`;
|
| 1002 |
+
return;
|
| 1003 |
+
}
|
| 1004 |
+
|
| 1005 |
+
elements.resultCard.classList.remove("has-entry");
|
| 1006 |
+
elements.resultCard.innerHTML = `
|
| 1007 |
+
<p class="result-message">${escapeHtml(message)}</p>
|
| 1008 |
+
`;
|
| 1009 |
+
}
|
| 1010 |
+
|
| 1011 |
+
function renderThread() {
|
| 1012 |
+
if (!state.thread.length) {
|
| 1013 |
+
hideResultCard();
|
| 1014 |
+
return;
|
| 1015 |
+
}
|
| 1016 |
+
|
| 1017 |
+
showResultCard();
|
| 1018 |
+
elements.resultCard.classList.add("has-entry");
|
| 1019 |
+
elements.resultCard.classList.remove("is-updating");
|
| 1020 |
+
elements.resultCard.innerHTML = renderThreadMarkup(state.thread);
|
| 1021 |
+
}
|
| 1022 |
+
|
| 1023 |
+
function renderThreadMarkup(entries) {
|
| 1024 |
+
return `
|
| 1025 |
+
<div class="result-feed">
|
| 1026 |
+
${entries.map((entry) => renderThreadItem(entry)).join("")}
|
| 1027 |
+
</div>
|
| 1028 |
+
`;
|
| 1029 |
+
}
|
| 1030 |
+
|
| 1031 |
+
function renderThreadItem(entry) {
|
| 1032 |
+
const questions = Array.isArray(entry.questions) ? entry.questions : [];
|
| 1033 |
+
const questionItems = questions.map((item) => `<li>${escapeHtml(item)}</li>`).join("");
|
| 1034 |
+
const statusLabel =
|
| 1035 |
+
entry.status === "pending" ? "ĐANG XỬ LÝ" : entry.status === "error" ? "LỖI" : entry.device ? entry.device.toUpperCase() : "AUTO";
|
| 1036 |
+
const questionBlock =
|
| 1037 |
+
entry.status === "pending"
|
| 1038 |
+
? `
|
| 1039 |
+
<div class="result-pending">
|
| 1040 |
+
<div class="atom-loader atom-loader-inline" aria-hidden="true">
|
| 1041 |
+
<span class="atom-core"></span>
|
| 1042 |
+
<span class="atom-orbit atom-orbit-a"><span class="atom-electron"></span></span>
|
| 1043 |
+
<span class="atom-orbit atom-orbit-b"><span class="atom-electron"></span></span>
|
| 1044 |
+
<span class="atom-orbit atom-orbit-c"><span class="atom-electron"></span></span>
|
| 1045 |
+
</div>
|
| 1046 |
+
<p class="result-note">Đang sinh câu hỏi từ đoạn văn bản này...</p>
|
| 1047 |
+
</div>
|
| 1048 |
+
`
|
| 1049 |
+
: entry.status === "error"
|
| 1050 |
+
? `<p class="result-message">${escapeHtml(entry.errorMessage || "Có lỗi xảy ra khi sinh câu hỏi.")}</p>`
|
| 1051 |
+
: questions.length
|
| 1052 |
+
? `<ol class="result-questions">${questionItems}</ol>`
|
| 1053 |
+
: '<p class="result-note">Mục lịch sử này chưa lưu danh sách câu hỏi. Hãy bấm “Sinh câu hỏi” để tạo lại.</p>';
|
| 1054 |
+
|
| 1055 |
+
return `
|
| 1056 |
+
<article class="result-thread-item" data-entry-id="${escapeHtml(entry.id)}">
|
| 1057 |
+
<div class="result-meta">
|
| 1058 |
+
<span>${escapeHtml(formatTimestamp(entry.createdAt))}</span>
|
| 1059 |
+
<span>${escapeHtml(statusLabel)}</span>
|
| 1060 |
+
<span>${escapeHtml(String(entry.count || questions.length || 0))} câu hỏi</span>
|
| 1061 |
+
${entry.elapsedMs ? `<span>${escapeHtml(String(entry.elapsedMs))} ms</span>` : ""}
|
| 1062 |
+
</div>
|
| 1063 |
+
|
| 1064 |
+
<section class="result-section">
|
| 1065 |
+
<div class="result-section-head">
|
| 1066 |
+
<h3 class="result-source-title">Văn bản đầu vào</h3>
|
| 1067 |
+
<button class="copy-button" type="button" data-copy-entry-id="${escapeHtml(entry.id)}" data-copy-target="source" aria-label="Sao chép">
|
| 1068 |
+
${copyIconMarkup()}
|
| 1069 |
+
</button>
|
| 1070 |
+
</div>
|
| 1071 |
+
<p class="result-source">${escapeHtml(entry.text || entry.title || "")}</p>
|
| 1072 |
+
</section>
|
| 1073 |
+
|
| 1074 |
+
<section class="result-section">
|
| 1075 |
+
<div class="result-section-head">
|
| 1076 |
+
<h3 class="result-questions-title">Câu hỏi sinh ra</h3>
|
| 1077 |
+
<button class="copy-button" type="button" data-copy-entry-id="${escapeHtml(entry.id)}" data-copy-target="response" aria-label="Sao chép">
|
| 1078 |
+
${copyIconMarkup()}
|
| 1079 |
+
</button>
|
| 1080 |
+
</div>
|
| 1081 |
+
${questionBlock}
|
| 1082 |
+
</section>
|
| 1083 |
+
</article>
|
| 1084 |
+
`;
|
| 1085 |
+
}
|
| 1086 |
+
|
| 1087 |
+
function copyIconMarkup() {
|
| 1088 |
+
return `
|
| 1089 |
+
<svg viewBox="0 0 24 24" fill="none" aria-hidden="true">
|
| 1090 |
+
<path
|
| 1091 |
+
d="M9 9.75A2.25 2.25 0 0 1 11.25 7.5h6A2.25 2.25 0 0 1 19.5 9.75v6A2.25 2.25 0 0 1 17.25 18h-6A2.25 2.25 0 0 1 9 15.75v-6Z"
|
| 1092 |
+
stroke="currentColor"
|
| 1093 |
+
stroke-linecap="round"
|
| 1094 |
+
stroke-linejoin="round"
|
| 1095 |
+
stroke-width="1.7"
|
| 1096 |
+
/>
|
| 1097 |
+
<path
|
| 1098 |
+
d="M15 7.5v-.75A2.25 2.25 0 0 0 12.75 4.5h-6A2.25 2.25 0 0 0 4.5 6.75v6A2.25 2.25 0 0 0 6.75 15H9"
|
| 1099 |
+
stroke="currentColor"
|
| 1100 |
+
stroke-linecap="round"
|
| 1101 |
+
stroke-linejoin="round"
|
| 1102 |
+
stroke-width="1.7"
|
| 1103 |
+
/>
|
| 1104 |
+
</svg>
|
| 1105 |
+
`;
|
| 1106 |
+
}
|
| 1107 |
+
|
| 1108 |
+
function buildCopyText(entryId, target) {
|
| 1109 |
+
const entry = state.thread.find((item) => item.id === entryId) || state.history.find((item) => item.id === entryId);
|
| 1110 |
+
if (!entry) return "";
|
| 1111 |
+
|
| 1112 |
+
if (target === "source") {
|
| 1113 |
+
return String(entry.text || entry.title || "").trim();
|
| 1114 |
+
}
|
| 1115 |
+
|
| 1116 |
+
if (entry.status === "pending") {
|
| 1117 |
+
return "Đang sinh câu hỏi từ đoạn văn bản này...";
|
| 1118 |
+
}
|
| 1119 |
+
|
| 1120 |
+
if (entry.status === "error") {
|
| 1121 |
+
return String(entry.errorMessage || "Có lỗi xảy ra khi sinh câu hỏi.").trim();
|
| 1122 |
+
}
|
| 1123 |
+
|
| 1124 |
+
const questions = Array.isArray(entry.questions) ? entry.questions : [];
|
| 1125 |
+
if (questions.length) {
|
| 1126 |
+
return questions.map((item, index) => `${index + 1}. ${item}`).join("\n");
|
| 1127 |
+
}
|
| 1128 |
+
|
| 1129 |
+
return "Mục lịch sử này chưa lưu danh sách câu hỏi. Hãy bấm “Sinh câu hỏi” để tạo lại.";
|
| 1130 |
+
}
|
| 1131 |
+
|
| 1132 |
+
async function copyToClipboard(text) {
|
| 1133 |
+
try {
|
| 1134 |
+
await navigator.clipboard.writeText(text);
|
| 1135 |
+
return true;
|
| 1136 |
+
} catch {
|
| 1137 |
+
try {
|
| 1138 |
+
const area = document.createElement("textarea");
|
| 1139 |
+
area.value = text;
|
| 1140 |
+
area.setAttribute("readonly", "");
|
| 1141 |
+
area.style.position = "fixed";
|
| 1142 |
+
area.style.opacity = "0";
|
| 1143 |
+
document.body.appendChild(area);
|
| 1144 |
+
area.select();
|
| 1145 |
+
const success = document.execCommand("copy");
|
| 1146 |
+
area.remove();
|
| 1147 |
+
return success;
|
| 1148 |
+
} catch {
|
| 1149 |
+
return false;
|
| 1150 |
+
}
|
| 1151 |
+
}
|
| 1152 |
+
}
|
| 1153 |
+
|
function setLoading(isLoading) {
  elements.generateButton.classList.toggle("is-loading", isLoading);
  document.body.classList.toggle("is-generating", isLoading);
  elements.resultCard.classList.toggle("is-updating", isLoading && elements.resultCard.classList.contains("has-entry"));
  syncQuestionCount(elements.questionCount.value);
  syncInteractiveControls();
}

function isInterfaceBusy() {
  return document.body.classList.contains("is-generating") || state.isSwitchingModel;
}

function syncInteractiveControls() {
  const isBusy = isInterfaceBusy();
  elements.generateButton.disabled = isBusy;
  elements.modelSelect.disabled = isBusy || state.availableModels.length <= 1;
}

function activateHeroTitle() {
  const node = elements.heroTitle;
  if (!node) return;

  const fullText = String(node.dataset.text || "").trim();
  if (!fullText) return;

  node.textContent = fullText;
  node.classList.remove("is-ready");

  if (window.matchMedia("(prefers-reduced-motion: reduce)").matches) {
    node.classList.add("is-ready");
    return;
  }

  window.requestAnimationFrame(() => {
    node.classList.add("is-ready");
  });
}

function shrink(text, maxLength) {
  const normalized = String(text || "").trim();
  if (normalized.length <= maxLength) return normalized;
  return `${normalized.slice(0, maxLength - 3).trim()}...`;
}

function formatTimestamp(value) {
  const date = new Date(value);
  if (Number.isNaN(date.getTime())) return "Vừa xong";

  const now = new Date();
  const sameDay = date.toDateString() === now.toDateString();
  const yesterday = new Date(now);
  yesterday.setDate(now.getDate() - 1);

  const timeLabel = date.toLocaleTimeString("vi-VN", { hour: "2-digit", minute: "2-digit" });
  if (sameDay) return `Hôm nay ${timeLabel}`;
  if (date.toDateString() === yesterday.toDateString()) return `Hôm qua ${timeLabel}`;

  return date.toLocaleDateString("vi-VN", { day: "2-digit", month: "2-digit", year: "numeric" });
}

function syncCopyright() {
  const year = new Date().getFullYear();
  elements.copyrightLine.textContent = `© ${year} HVU - KTCN`;
}

function escapeHtml(value) {
  return String(value || "")
    .replaceAll("&", "&amp;")
    .replaceAll("<", "&lt;")
    .replaceAll(">", "&gt;")
    .replaceAll('"', "&quot;")
    .replaceAll("'", "&#39;");
}

function makeId() {
  if (window.crypto && typeof window.crypto.randomUUID === "function") {
    return window.crypto.randomUUID();
  }
  return `item-${Date.now()}-${Math.random().toString(16).slice(2)}`;
}
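The pure string helpers in the section above can be exercised outside the browser. A minimal sketch with standalone copies of `shrink` and `escapeHtml` (copied here because the originals live next to DOM-dependent code that needs `elements` and `state`):

```javascript
// Standalone copies of the pure helpers from app.js (no DOM required).
function shrink(text, maxLength) {
  const normalized = String(text || "").trim();
  if (normalized.length <= maxLength) return normalized;
  // Reserve three characters for the ellipsis suffix.
  return `${normalized.slice(0, maxLength - 3).trim()}...`;
}

function escapeHtml(value) {
  // Ampersand must be escaped first so later entities are not double-escaped.
  return String(value || "")
    .replaceAll("&", "&amp;")
    .replaceAll("<", "&lt;")
    .replaceAll(">", "&gt;")
    .replaceAll('"', "&quot;")
    .replaceAll("'", "&#39;");
}

console.log(shrink("hello world example", 10)); // truncated with a "..." suffix
console.log(escapeHtml('<b>"x" & y</b>'));      // all markup characters escaped
```

The escape order matters: replacing `&` last would turn an already-emitted `&lt;` into `&amp;lt;`.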
HVU_QA/frontend/index.html
ADDED
|
@@ -0,0 +1,265 @@
<!DOCTYPE html>
<html lang="vi">
<head>
  <meta charset="UTF-8" />
  <meta name="viewport" content="width=device-width, initial-scale=1.0" />
  <title>HVU QA - Mô hình sinh câu hỏi thường gặp</title>
  <meta
    name="description"
    content="Hệ thống sinh câu hỏi thường gặp của Trường Đại học Hùng Vương."
  />
  <link rel="preconnect" href="https://fonts.googleapis.com" />
  <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin />
  <link
    href="https://fonts.googleapis.com/css2?family=Be+Vietnam+Pro:wght@400;500;600;700;800&display=swap"
    rel="stylesheet"
  />
  <link rel="stylesheet" href="/frontend/style.css" />
  <script src="/frontend/app.js" defer></script>
</head>
<body>
  <div class="page-shell">
    <aside class="sidebar" id="sidebar">
      <div class="sidebar-top">
        <button
          class="menu-toggle"
          id="menuToggle"
          type="button"
          aria-label="Mở hoặc đóng thanh bên"
          aria-expanded="false"
          aria-controls="sidebarContent"
        >
          <span></span>
          <span></span>
          <span></span>
        </button>
      </div>

      <div class="sidebar-content" id="sidebarContent" aria-hidden="true">
        <section class="side-card">
          <p class="side-label">Chọn model</p>
          <label class="select-shell">
            <span class="select-icon" aria-hidden="true">
              <svg viewBox="0 0 24 24" fill="none">
                <path
                  d="M12 3 4.5 7.2v9.6L12 21l7.5-4.2V7.2L12 3Zm0 0v8.4m0 0 7.5-4.2M12 11.4 4.5 7.2"
                  stroke="currentColor"
                  stroke-linecap="round"
                  stroke-linejoin="round"
                  stroke-width="1.6"
                />
              </svg>
            </span>
            <select id="modelSelect" disabled>
              <option>Đang tải danh sách model...</option>
            </select>
          </label>
        </section>

        <section class="side-card status-card">
          <div class="status-chip">
            <span class="status-dot" aria-hidden="true"></span>
            <p id="deviceStatus">Đang kiểm tra thiết bị...</p>
          </div>
          <p class="status-note" id="modelStatus">Model sẽ được nạp ở lần sinh câu hỏi đầu tiên.</p>
        </section>

        <section class="side-card history-card">
          <div class="history-header">
            <p class="side-label">Lịch sử</p>
            <button id="clearHistory" type="button">Xóa</button>
          </div>
          <div class="history-list" id="historyList"></div>
        </section>

        <section class="author-card">
          <button
            class="author-toggle"
            id="authorToggle"
            type="button"
            aria-expanded="false"
            aria-controls="authorContent"
          >
            <div class="author-header">
              <div class="author-header-icon" aria-hidden="true">
                <svg viewBox="0 0 24 24" fill="none">
                  <path
                    d="M12 12a4 4 0 1 0 0-8 4 4 0 0 0 0 8Zm-7 8c0-3.314 3.134-6 7-6s7 2.686 7 6"
                    stroke="currentColor"
                    stroke-linecap="round"
                    stroke-linejoin="round"
                    stroke-width="1.8"
                  />
                </svg>
              </div>
              <div>
                <p class="author-title">Tác giả</p>
              </div>
            </div>
            <span class="author-toggle-icon" aria-hidden="true">
              <svg viewBox="0 0 24 24" fill="none">
                <path
                  d="m8 10 4 4 4-4"
                  stroke="currentColor"
                  stroke-linecap="round"
                  stroke-linejoin="round"
                  stroke-width="1.8"
                />
              </svg>
            </span>
          </button>

          <div class="author-content" id="authorContent" hidden>
            <div class="author-grid" id="authorList"></div>
          </div>

          <div class="author-footer">
            <p id="copyrightLine">© 2026 HVU - KTCN</p>
          </div>
        </section>
      </div>
    </aside>

    <main class="workspace">
      <header class="topbar">
        <div class="identity">
          <img class="logo" src="/assets/HVU.png" alt="Logo Trường Đại học Hùng Vương" />
          <div class="identity-copy">
            <h1>Trường Đại học Hùng Vương</h1>
            <p>Khoa Kỹ thuật - công nghệ</p>
          </div>
        </div>
      </header>

      <section class="hero-panel">
        <div class="hero-copy">
          <h2>
            <span
              class="typewriter-text"
              id="heroTitle"
              data-text="Mô hình sinh câu hỏi thường gặp"
              aria-label="Mô hình sinh câu hỏi thường gặp"
            ></span>
          </h2>
        </div>
      </section>

      <section class="result-card" id="resultCard" aria-live="polite" hidden></section>

      <form class="composer" id="generatorForm">
        <div class="input-shell" id="sourceShell">
          <label class="visually-hidden" for="sourceText">Nhập đoạn văn bản</label>
          <textarea
            id="sourceText"
            name="sourceText"
            rows="1"
            placeholder="Nhập đoạn văn bản ..."
            required
          ></textarea>

          <div class="composer-actions">
            <div class="count-shell" aria-label="Số câu hỏi">
              <span class="count-label">Số câu hỏi</span>
              <div class="count-stepper" role="group" aria-label="Điều chỉnh số câu hỏi">
                <button class="count-button" id="decreaseCount" type="button" aria-label="Giảm số câu hỏi">
                  <span aria-hidden="true">-</span>
                </button>
                <output class="count-value" id="questionCountValue" for="questionCount">0</output>
                <button class="count-button" id="increaseCount" type="button" aria-label="Tăng số câu hỏi">
                  <span aria-hidden="true">+</span>
                </button>
              </div>
              <input id="questionCount" name="questionCount" type="hidden" value="0" />
            </div>

            <div class="action-cluster">
              <span class="voice-status is-empty" id="voiceStatus" aria-live="polite"></span>
              <div class="action-buttons">
                <button
                  class="voice-button"
                  id="voiceInputButton"
                  type="button"
                  aria-label="Nhập bằng giọng nói qua trình duyệt"
                >
                  <span class="voice-button-icon" aria-hidden="true">
                    <svg viewBox="0 0 24 24" fill="none">
                      <path
                        d="M12 15a3.5 3.5 0 0 0 3.5-3.5v-4a3.5 3.5 0 1 0-7 0v4A3.5 3.5 0 0 0 12 15Z"
                        stroke="currentColor"
                        stroke-linecap="round"
                        stroke-linejoin="round"
                        stroke-width="1.8"
                      />
                      <path
                        d="M6.5 11.5a5.5 5.5 0 1 0 11 0M12 17v3m-3 0h6"
                        stroke="currentColor"
                        stroke-linecap="round"
                        stroke-linejoin="round"
                        stroke-width="1.8"
                      />
                    </svg>
                  </span>
                </button>

                <button class="generate-button" id="generateButton" type="submit">
                  <span class="atom-loader atom-loader-sm" aria-hidden="true">
                    <span class="atom-core"></span>
                    <span class="atom-orbit atom-orbit-a"><span class="atom-electron"></span></span>
                    <span class="atom-orbit atom-orbit-b"><span class="atom-electron"></span></span>
                    <span class="atom-orbit atom-orbit-c"><span class="atom-electron"></span></span>
                  </span>
                  <span class="button-label">Sinh câu hỏi</span>
                </button>
              </div>
            </div>
          </div>
        </div>
      </form>

      <section class="landing-panel" id="landingPanel">
        <article class="landing-card landing-guide">
          <div class="landing-card-head">
            <p class="landing-kicker">Hướng dẫn nhanh</p>
            <p class="landing-card-note">Tạo bộ câu hỏi từ đoạn văn bản đầu vào.</p>
          </div>
          <ol class="landing-guide-list">
            <li>Nhập hoặc dán đoạn văn bản vào ô nhập.</li>
            <li>Chọn số lượng câu hỏi cần sinh.</li>
            <li>Nhấn <strong>Sinh câu hỏi</strong> để hệ thống xử lý.</li>
          </ol>
        </article>

        <article class="landing-card landing-samples">
          <div class="landing-card-head">
            <p class="landing-kicker">Ví dụ mẫu</p>
            <p class="landing-card-note">Chọn văn bản luật mẫu để chèn nhanh nội dung thử nghiệm.</p>
          </div>
          <div class="landing-sample-grid" id="sampleList"></div>
        </article>

        <article class="landing-card landing-system">
          <div class="landing-card-head">
            <p class="landing-kicker">Trạng thái model</p>
            <span class="landing-runtime-badge is-pending" id="landingRuntimeBadge">Đang kiểm tra</span>
          </div>
          <p class="landing-card-note" id="landingStatusText">Đang kết nối backend và đọc cấu hình hệ thống.</p>
          <div class="landing-system-list">
            <div class="landing-system-row">
              <span>Model đang dùng</span>
              <strong id="landingModelName">Đang tải...</strong>
            </div>
            <div class="landing-system-row">
              <span>Thiết bị xử lý</span>
              <strong id="landingDeviceName">Đang kiểm tra...</strong>
            </div>
            <div class="landing-system-row">
              <span>Số model khả dụng</span>
              <strong id="landingModelCount">0</strong>
            </div>
          </div>
        </article>
      </section>
    </main>
  </div>
</body>
</html>
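The count stepper in the form above pairs a visible `<output>` (`questionCountValue`) with a hidden `questionCount` input, so the sync logic has to clamp whatever value arrives before writing it back. A DOM-free sketch of that clamping step — the bounds and the function name here are assumptions for illustration; the real sync code lives in `app.js`:

```javascript
// Hypothetical clamp for the question-count stepper (min/max bounds assumed).
function clampCount(value, min = 0, max = 10) {
  const parsed = Number.parseInt(String(value), 10);
  // Non-numeric input falls back to the minimum instead of NaN.
  if (Number.isNaN(parsed)) return min;
  return Math.min(max, Math.max(min, parsed));
}

console.log(clampCount("5"));   // within bounds, passes through
console.log(clampCount("-3"));  // clamped up to the minimum
console.log(clampCount("abc")); // non-numeric falls back to the minimum
```

In the page itself the same value would then be mirrored into both the `<output>` text and the hidden input so form submission and display never diverge.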
HVU_QA/frontend/style.css
ADDED
|
@@ -0,0 +1,1792 @@
:root {
  --bg: #f6f4fb;
  --panel: rgba(255, 255, 255, 0.88);
  --panel-soft: rgba(250, 248, 255, 0.92);
  --line: rgba(92, 85, 168, 0.12);
  --line-strong: rgba(92, 85, 168, 0.22);
  --text: #232343;
  --text-soft: #66648a;
  --text-muted: #9793b5;
  --accent: #5c63e7;
  --accent-strong: #434dc7;
  --accent-soft: rgba(92, 99, 231, 0.09);
  --warm: #e0607c;
  --gold: #f0b558;
  --shadow-main: 0 28px 64px rgba(70, 62, 132, 0.12);
  --shadow-soft: 0 12px 28px rgba(97, 88, 171, 0.08);
  --radius-xl: 30px;
  --radius-lg: 24px;
  --radius-md: 18px;
  --radius-sm: 14px;
}

* {
  box-sizing: border-box;
}

html,
body {
  min-height: 100%;
}

body {
  margin: 0;
  font-family: "Be Vietnam Pro", "Segoe UI", sans-serif;
  color: var(--text);
  background:
    radial-gradient(circle at 15% 15%, rgba(92, 99, 231, 0.08), transparent 22%),
    radial-gradient(circle at 85% 85%, rgba(224, 96, 124, 0.1), transparent 18%),
    linear-gradient(180deg, #f8f6fc 0%, #f2eef9 100%);
}

button,
input,
select,
textarea {
  font: inherit;
}

.page-shell {
  --sidebar-width: 96px;
  width: min(1420px, calc(100vw - 32px));
  min-height: calc(100vh - 32px);
  margin: 16px auto;
  display: grid;
  grid-template-columns: var(--sidebar-width) minmax(0, 1fr);
  background: linear-gradient(180deg, rgba(255, 255, 255, 0.74), rgba(246, 241, 255, 0.82));
  border: 1px solid rgba(255, 255, 255, 0.88);
  border-radius: 28px;
  box-shadow: var(--shadow-main);
  overflow: hidden;
  backdrop-filter: blur(18px);
  transition: grid-template-columns 0.28s ease;
}

body.sidebar-open .page-shell {
  --sidebar-width: 360px;
}

.sidebar {
  padding: 18px;
  background: linear-gradient(180deg, rgba(247, 244, 255, 0.98), rgba(239, 244, 255, 0.96));
  border-right: 1px solid var(--line);
  display: flex;
  flex-direction: column;
  gap: 14px;
}

.sidebar-top {
  min-height: 64px;
  display: flex;
  align-items: center;
  justify-content: center;
}

.menu-toggle {
  width: 56px;
  height: 56px;
  border: 0;
  border-radius: 18px;
  background: #fff;
  box-shadow: var(--shadow-soft);
  display: inline-grid;
  place-content: center;
  gap: 5px;
  cursor: pointer;
  transition: transform 0.2s ease, box-shadow 0.2s ease;
}

.menu-toggle:hover {
  transform: translateY(-1px);
  box-shadow: 0 16px 30px rgba(97, 88, 171, 0.12);
}

.menu-toggle span {
  width: 24px;
  height: 3px;
  border-radius: 999px;
  background: #5d5a8c;
  transition: transform 0.24s ease, opacity 0.24s ease;
}

body.sidebar-open .menu-toggle span:nth-child(1) {
  transform: translateY(8px) rotate(45deg);
}

body.sidebar-open .menu-toggle span:nth-child(2) {
  opacity: 0;
}

body.sidebar-open .menu-toggle span:nth-child(3) {
  transform: translateY(-8px) rotate(-45deg);
}

.sidebar-content {
  display: grid;
  gap: 14px;
  opacity: 0;
  max-height: 0;
  overflow: hidden;
  pointer-events: none;
  transform: translateY(-8px);
  transition: opacity 0.22s ease, transform 0.22s ease, max-height 0.28s ease;
}

body.sidebar-open .sidebar-content {
  opacity: 1;
  max-height: 2000px;
  pointer-events: auto;
  transform: translateY(0);
}

.side-card,
.author-card,
.result-card,
.composer {
  border: 1px solid rgba(255, 255, 255, 0.92);
  box-shadow: var(--shadow-soft);
}

.side-card {
  padding: 14px;
  border-radius: 20px;
  background: linear-gradient(180deg, rgba(242, 245, 255, 0.98), rgba(252, 252, 255, 0.94));
  border-color: rgba(151, 161, 240, 0.16);
}

.side-label {
  margin: 0 0 12px;
  font-size: 0.8rem;
  font-weight: 700;
  letter-spacing: 0.03em;
  color: var(--text-muted);
}

.select-shell {
  min-height: 54px;
  display: flex;
  align-items: center;
  gap: 12px;
  padding: 0 14px;
  border-radius: 16px;
  border: 1px solid var(--line);
  background: #fff;
}

.select-icon {
  width: 20px;
  height: 20px;
  color: var(--accent);
  display: inline-flex;
}

.select-shell select {
  width: 100%;
  border: 0;
  outline: none;
  background: transparent;
  color: #585480;
  font-weight: 500;
}

.status-card {
  display: grid;
  gap: 8px;
}

.status-chip {
  display: flex;
  align-items: center;
  gap: 10px;
}

.status-chip p,
.status-note {
  margin: 0;
}

.status-chip p {
  font-size: 0.92rem;
  font-weight: 600;
  color: #57527d;
}

.status-note {
  font-size: 0.84rem;
  line-height: 1.6;
  color: var(--text-muted);
}

.status-dot {
  width: 10px;
  height: 10px;
  border-radius: 50%;
  background: linear-gradient(135deg, var(--warm), var(--accent));
  box-shadow: 0 0 0 5px rgba(92, 99, 231, 0.1);
}

.history-card {
  min-height: 240px;
  display: flex;
  flex-direction: column;
}

.history-header {
  display: flex;
  align-items: center;
  justify-content: space-between;
  gap: 12px;
  margin-bottom: 8px;
}

.history-header .side-label {
  margin: 0;
}

.history-header button {
  border: 0;
  background: transparent;
  color: var(--accent);
  font-weight: 600;
  cursor: pointer;
}

.history-list {
  display: grid;
  gap: 10px;
  max-height: 100%;
  overflow: auto;
  padding-right: 4px;
}

.history-list::-webkit-scrollbar {
  width: 8px;
}

.history-list::-webkit-scrollbar-thumb {
  border-radius: 999px;
  background: rgba(92, 99, 231, 0.2);
}

.history-item {
  width: 100%;
  border: 1px solid transparent;
  border-radius: 16px;
  background: rgba(255, 255, 255, 0.82);
  padding: 12px 14px;
  display: grid;
  grid-template-columns: auto 1fr;
  gap: 12px;
  align-items: start;
  text-align: left;
  color: inherit;
  cursor: pointer;
  transition: transform 0.18s ease, box-shadow 0.18s ease, border-color 0.18s ease;
}

.history-item:hover,
.history-item.is-active {
  transform: translateY(-1px);
  border-color: rgba(92, 99, 231, 0.18);
  box-shadow: 0 12px 24px rgba(92, 99, 231, 0.1);
}

.history-icon {
  width: 24px;
  height: 24px;
  color: var(--accent);
  display: inline-grid;
  place-items: center;
  margin-top: 3px;
}

.history-main strong,
.history-main span {
  display: block;
}

.history-main strong {
  font-size: 0.95rem;
  line-height: 1.45;
  font-weight: 600;
  color: #4a466e;
}

.history-main span {
  margin-top: 4px;
  font-size: 0.8rem;
  color: var(--text-muted);
}

.history-empty {
  padding: 14px;
  border-radius: 16px;
  background: rgba(255, 255, 255, 0.7);
  color: var(--text-muted);
  line-height: 1.6;
}

.author-card {
  border-radius: 22px;
  padding: 16px;
  display: grid;
  gap: 12px;
  background: linear-gradient(180deg, rgba(255, 243, 247, 0.98), rgba(247, 242, 255, 0.96));
border-color: rgba(236, 158, 182, 0.18);
|
| 336 |
+
}
|
| 337 |
+
|
| 338 |
+
.author-toggle {
|
| 339 |
+
min-width: 0;
|
| 340 |
+
width: 100%;
|
| 341 |
+
padding: 0;
|
| 342 |
+
border: 0;
|
| 343 |
+
background: transparent;
|
| 344 |
+
color: inherit;
|
| 345 |
+
display: flex;
|
| 346 |
+
align-items: center;
|
| 347 |
+
justify-content: space-between;
|
| 348 |
+
gap: 12px;
|
| 349 |
+
text-align: left;
|
| 350 |
+
cursor: pointer;
|
| 351 |
+
}
|
| 352 |
+
|
| 353 |
+
.author-toggle:focus-visible {
|
| 354 |
+
outline: 2px solid rgba(92, 99, 231, 0.24);
|
| 355 |
+
outline-offset: 4px;
|
| 356 |
+
border-radius: 18px;
|
| 357 |
+
}
|
| 358 |
+
|
| 359 |
+
.author-header {
|
| 360 |
+
min-width: 0;
|
| 361 |
+
display: grid;
|
| 362 |
+
grid-template-columns: auto 1fr;
|
| 363 |
+
gap: 12px;
|
| 364 |
+
align-items: center;
|
| 365 |
+
}
|
| 366 |
+
|
| 367 |
+
.author-header > :last-child {
|
| 368 |
+
min-width: 0;
|
| 369 |
+
}
|
| 370 |
+
|
| 371 |
+
.author-toggle-icon {
|
| 372 |
+
width: 34px;
|
| 373 |
+
height: 34px;
|
| 374 |
+
border-radius: 12px;
|
| 375 |
+
background: rgba(255, 255, 255, 0.8);
|
| 376 |
+
color: #7b75a6;
|
| 377 |
+
display: inline-grid;
|
| 378 |
+
place-items: center;
|
| 379 |
+
flex: none;
|
| 380 |
+
transition: transform 0.18s ease, background 0.18s ease, color 0.18s ease;
|
| 381 |
+
}
|
| 382 |
+
|
| 383 |
+
.author-toggle-icon svg {
|
| 384 |
+
width: 18px;
|
| 385 |
+
height: 18px;
|
| 386 |
+
}
|
| 387 |
+
|
| 388 |
+
.author-toggle.is-open .author-toggle-icon {
|
| 389 |
+
transform: rotate(180deg);
|
| 390 |
+
background: rgba(240, 234, 255, 0.98);
|
| 391 |
+
color: var(--accent-strong);
|
| 392 |
+
}
|
| 393 |
+
|
| 394 |
+
.author-header-icon {
|
| 395 |
+
width: 44px;
|
| 396 |
+
height: 44px;
|
| 397 |
+
border-radius: 16px;
|
| 398 |
+
background: linear-gradient(180deg, rgba(240, 228, 255, 0.98), rgba(255, 240, 246, 0.98));
|
| 399 |
+
color: var(--accent);
|
| 400 |
+
display: inline-grid;
|
| 401 |
+
place-items: center;
|
| 402 |
+
box-shadow: 0 10px 20px rgba(123, 117, 166, 0.08);
|
| 403 |
+
}
|
| 404 |
+
|
| 405 |
+
.author-header-icon svg {
|
| 406 |
+
width: 20px;
|
| 407 |
+
height: 20px;
|
| 408 |
+
}
|
| 409 |
+
|
| 410 |
+
.author-title,
|
| 411 |
+
.author-footer p,
|
| 412 |
+
.author-person-summary,
|
| 413 |
+
.author-person-meta-label,
|
| 414 |
+
.author-person-meta-value {
|
| 415 |
+
margin: 0;
|
| 416 |
+
}
|
| 417 |
+
|
| 418 |
+
.author-title {
|
| 419 |
+
font-size: 0.82rem;
|
| 420 |
+
font-weight: 800;
|
| 421 |
+
letter-spacing: 0.04em;
|
| 422 |
+
color: #7b75a6;
|
| 423 |
+
overflow-wrap: anywhere;
|
| 424 |
+
word-break: break-word;
|
| 425 |
+
white-space: normal;
|
| 426 |
+
}
|
| 427 |
+
|
| 428 |
+
.author-grid {
|
| 429 |
+
min-width: 0;
|
| 430 |
+
display: grid;
|
| 431 |
+
gap: 10px;
|
| 432 |
+
}
|
| 433 |
+
|
| 434 |
+
.author-content {
|
| 435 |
+
min-width: 0;
|
| 436 |
+
display: grid;
|
| 437 |
+
gap: 12px;
|
| 438 |
+
}
|
| 439 |
+
|
| 440 |
+
.author-content[hidden] {
|
| 441 |
+
display: none;
|
| 442 |
+
}
|
| 443 |
+
|
| 444 |
+
.author-person {
|
| 445 |
+
min-width: 0;
|
| 446 |
+
width: 100%;
|
| 447 |
+
border: 1px solid rgba(184, 167, 224, 0.24);
|
| 448 |
+
border-radius: 18px;
|
| 449 |
+
background: rgba(255, 255, 255, 0.74);
|
| 450 |
+
display: grid;
|
| 451 |
+
overflow: hidden;
|
| 452 |
+
transition:
|
| 453 |
+
transform 0.18s ease,
|
| 454 |
+
box-shadow 0.18s ease,
|
| 455 |
+
border-color 0.18s ease,
|
| 456 |
+
background 0.18s ease;
|
| 457 |
+
}
|
| 458 |
+
|
| 459 |
+
.author-person:hover,
|
| 460 |
+
.author-person.is-active {
|
| 461 |
+
transform: translateY(-1px);
|
| 462 |
+
border-color: rgba(92, 99, 231, 0.22);
|
| 463 |
+
background: rgba(255, 255, 255, 0.94);
|
| 464 |
+
box-shadow: 0 12px 24px rgba(92, 99, 231, 0.1);
|
| 465 |
+
}
|
| 466 |
+
|
| 467 |
+
.author-person-trigger {
|
| 468 |
+
min-width: 0;
|
| 469 |
+
width: 100%;
|
| 470 |
+
padding: 14px;
|
| 471 |
+
border: 0;
|
| 472 |
+
background: transparent;
|
| 473 |
+
text-align: left;
|
| 474 |
+
color: inherit;
|
| 475 |
+
display: grid;
|
| 476 |
+
gap: 8px;
|
| 477 |
+
cursor: pointer;
|
| 478 |
+
}
|
| 479 |
+
|
| 480 |
+
.author-person-trigger > * {
|
| 481 |
+
min-width: 0;
|
| 482 |
+
}
|
| 483 |
+
|
| 484 |
+
.author-person-trigger:focus-visible {
|
| 485 |
+
outline: 2px solid rgba(92, 99, 231, 0.28);
|
| 486 |
+
outline-offset: 2px;
|
| 487 |
+
}
|
| 488 |
+
|
| 489 |
+
.author-person-top {
|
| 490 |
+
min-width: 0;
|
| 491 |
+
display: flex;
|
| 492 |
+
align-items: flex-start;
|
| 493 |
+
justify-content: space-between;
|
| 494 |
+
gap: 10px;
|
| 495 |
+
}
|
| 496 |
+
|
| 497 |
+
.author-person-role {
|
| 498 |
+
min-width: 0;
|
| 499 |
+
flex: 1 1 auto;
|
| 500 |
+
padding: 0;
|
| 501 |
+
color: var(--accent-strong);
|
| 502 |
+
display: block;
|
| 503 |
+
font-size: 0.74rem;
|
| 504 |
+
font-weight: 700;
|
| 505 |
+
line-height: 1.35;
|
| 506 |
+
overflow-wrap: anywhere;
|
| 507 |
+
word-break: break-word;
|
| 508 |
+
white-space: normal;
|
| 509 |
+
}
|
| 510 |
+
|
| 511 |
+
.author-person-chevron {
|
| 512 |
+
width: 18px;
|
| 513 |
+
height: 18px;
|
| 514 |
+
color: #7b75a6;
|
| 515 |
+
display: inline-grid;
|
| 516 |
+
place-items: center;
|
| 517 |
+
flex: none;
|
| 518 |
+
transition: transform 0.18s ease, color 0.18s ease;
|
| 519 |
+
}
|
| 520 |
+
|
| 521 |
+
.author-person-chevron svg {
|
| 522 |
+
width: 16px;
|
| 523 |
+
height: 16px;
|
| 524 |
+
}
|
| 525 |
+
|
| 526 |
+
.author-person.is-active .author-person-chevron {
|
| 527 |
+
transform: rotate(180deg);
|
| 528 |
+
color: var(--accent-strong);
|
| 529 |
+
}
|
| 530 |
+
|
| 531 |
+
.author-person strong {
|
| 532 |
+
display: block;
|
| 533 |
+
font-size: 1rem;
|
| 534 |
+
line-height: 1.35;
|
| 535 |
+
color: #403a67;
|
| 536 |
+
overflow-wrap: anywhere;
|
| 537 |
+
word-break: break-word;
|
| 538 |
+
white-space: normal;
|
| 539 |
+
}
|
| 540 |
+
|
| 541 |
+
.author-person-summary {
|
| 542 |
+
font-size: 0.84rem;
|
| 543 |
+
line-height: 1.55;
|
| 544 |
+
color: #6f6996;
|
| 545 |
+
overflow-wrap: anywhere;
|
| 546 |
+
word-break: break-word;
|
| 547 |
+
white-space: normal;
|
| 548 |
+
}
|
| 549 |
+
|
| 550 |
+
.author-person-body {
|
| 551 |
+
min-width: 0;
|
| 552 |
+
margin: 0 14px 14px;
|
| 553 |
+
padding-top: 12px;
|
| 554 |
+
border-top: 1px solid rgba(187, 171, 227, 0.2);
|
| 555 |
+
display: grid;
|
| 556 |
+
gap: 8px;
|
| 557 |
+
}
|
| 558 |
+
|
| 559 |
+
.author-person-body > * {
|
| 560 |
+
min-width: 0;
|
| 561 |
+
}
|
| 562 |
+
|
| 563 |
+
.author-person-meta {
|
| 564 |
+
min-width: 0;
|
| 565 |
+
display: grid;
|
| 566 |
+
gap: 8px;
|
| 567 |
+
}
|
| 568 |
+
|
| 569 |
+
.author-person-meta-row {
|
| 570 |
+
min-width: 0;
|
| 571 |
+
display: grid;
|
| 572 |
+
grid-template-columns: 1fr;
|
| 573 |
+
gap: 2px;
|
| 574 |
+
align-items: start;
|
| 575 |
+
}
|
| 576 |
+
|
| 577 |
+
.author-person-meta-label {
|
| 578 |
+
font-size: 0.75rem;
|
| 579 |
+
font-weight: 700;
|
| 580 |
+
letter-spacing: 0.03em;
|
| 581 |
+
color: #8a84b2;
|
| 582 |
+
white-space: normal;
|
| 583 |
+
}
|
| 584 |
+
|
| 585 |
+
.author-person-meta-value {
|
| 586 |
+
min-width: 0;
|
| 587 |
+
font-size: 0.88rem;
|
| 588 |
+
line-height: 1.4;
|
| 589 |
+
font-weight: 700;
|
| 590 |
+
color: #4e4878;
|
| 591 |
+
overflow-wrap: anywhere;
|
| 592 |
+
word-break: break-word;
|
| 593 |
+
white-space: normal;
|
| 594 |
+
}
|
| 595 |
+
|
| 596 |
+
.author-footer {
|
| 597 |
+
padding-top: 2px;
|
| 598 |
+
}
|
| 599 |
+
|
| 600 |
+
.author-footer p {
|
| 601 |
+
font-size: 0.9rem;
|
| 602 |
+
color: #6b648d;
|
| 603 |
+
overflow-wrap: anywhere;
|
| 604 |
+
word-break: break-word;
|
| 605 |
+
white-space: normal;
|
| 606 |
+
}
|
| 607 |
+
|
| 608 |
+
.workspace {
|
| 609 |
+
display: flex;
|
| 610 |
+
flex-direction: column;
|
| 611 |
+
min-width: 0;
|
| 612 |
+
background:
|
| 613 |
+
radial-gradient(circle at 50% 24%, rgba(92, 99, 231, 0.12), transparent 22%),
|
| 614 |
+
linear-gradient(180deg, rgba(255, 255, 255, 0.9), rgba(248, 244, 255, 0.9));
|
| 615 |
+
}
|
| 616 |
+
|
| 617 |
+
.topbar {
|
| 618 |
+
min-height: auto;
|
| 619 |
+
padding: 42px 34px 10px;
|
| 620 |
+
display: flex;
|
| 621 |
+
justify-content: center;
|
| 622 |
+
text-align: center;
|
| 623 |
+
background: transparent;
|
| 624 |
+
}
|
| 625 |
+
|
| 626 |
+
.identity {
|
| 627 |
+
display: grid;
|
| 628 |
+
justify-items: center;
|
| 629 |
+
gap: 16px;
|
| 630 |
+
}
|
| 631 |
+
|
| 632 |
+
.logo {
|
| 633 |
+
width: clamp(88px, 10vw, 110px);
|
| 634 |
+
height: clamp(88px, 10vw, 110px);
|
| 635 |
+
object-fit: contain;
|
| 636 |
+
flex: none;
|
| 637 |
+
filter: drop-shadow(0 12px 24px rgba(92, 99, 231, 0.12));
|
| 638 |
+
}
|
| 639 |
+
|
| 640 |
+
.identity-copy {
|
| 641 |
+
display: grid;
|
| 642 |
+
justify-items: center;
|
| 643 |
+
gap: 8px;
|
| 644 |
+
}
|
| 645 |
+
|
| 646 |
+
.identity-copy h1,
|
| 647 |
+
.identity-copy p,
|
| 648 |
+
.hero-copy h2,
|
| 649 |
+
.button-label {
|
| 650 |
+
margin: 0;
|
| 651 |
+
}
|
| 652 |
+
|
| 653 |
+
.identity-copy h1 {
|
| 654 |
+
max-width: none;
|
| 655 |
+
font-size: clamp(1.55rem, 2.45vw, 2.55rem);
|
| 656 |
+
line-height: 1.1;
|
| 657 |
+
font-weight: 800;
|
| 658 |
+
letter-spacing: -0.04em;
|
| 659 |
+
color: #22203d;
|
| 660 |
+
white-space: nowrap;
|
| 661 |
+
}
|
| 662 |
+
|
| 663 |
+
.identity-copy p {
|
| 664 |
+
font-size: clamp(1.08rem, 1.38vw, 1.42rem);
|
| 665 |
+
line-height: 1.2;
|
| 666 |
+
font-weight: 700;
|
| 667 |
+
color: #4d466f;
|
| 668 |
+
text-wrap: balance;
|
| 669 |
+
}
|
| 670 |
+
|
| 671 |
+
.hero-panel {
|
| 672 |
+
padding: 0 34px 18px;
|
| 673 |
+
display: grid;
|
| 674 |
+
justify-items: center;
|
| 675 |
+
gap: 10px;
|
| 676 |
+
text-align: center;
|
| 677 |
+
}
|
| 678 |
+
|
| 679 |
+
.hero-copy {
|
| 680 |
+
width: min(100%, 1180px);
|
| 681 |
+
display: grid;
|
| 682 |
+
justify-items: center;
|
| 683 |
+
}
|
| 684 |
+
|
| 685 |
+
.hero-copy h2 {
|
| 686 |
+
max-width: none;
|
| 687 |
+
min-height: auto;
|
| 688 |
+
font-size: clamp(1.65rem, 3vw, 2.9rem);
|
| 689 |
+
line-height: 1.12;
|
| 690 |
+
font-weight: 800;
|
| 691 |
+
letter-spacing: -0.04em;
|
| 692 |
+
color: var(--accent-strong);
|
| 693 |
+
white-space: nowrap;
|
| 694 |
+
}
|
| 695 |
+
|
| 696 |
+
.typewriter-text {
|
| 697 |
+
position: relative;
|
| 698 |
+
display: inline-block;
|
| 699 |
+
padding: 0.04em 0.08em 0.08em;
|
| 700 |
+
opacity: 0;
|
| 701 |
+
transform: translateY(18px) scale(0.985);
|
| 702 |
+
filter: blur(10px);
|
| 703 |
+
transition:
|
| 704 |
+
opacity 0.75s ease,
|
| 705 |
+
transform 0.75s cubic-bezier(0.2, 0.8, 0.2, 1),
|
| 706 |
+
filter 0.75s ease;
|
| 707 |
+
overflow: hidden;
|
| 708 |
+
}
|
| 709 |
+
|
| 710 |
+
.typewriter-text::before {
|
| 711 |
+
content: "";
|
| 712 |
+
position: absolute;
|
| 713 |
+
inset: -8% -5%;
|
| 714 |
+
background: linear-gradient(
|
| 715 |
+
112deg,
|
| 716 |
+
transparent 0%,
|
| 717 |
+
rgba(255, 255, 255, 0) 38%,
|
| 718 |
+
rgba(255, 255, 255, 0.78) 49%,
|
| 719 |
+
rgba(255, 255, 255, 0.18) 56%,
|
| 720 |
+
transparent 66%
|
| 721 |
+
);
|
| 722 |
+
transform: translateX(-135%) skewX(-18deg);
|
| 723 |
+
opacity: 0;
|
| 724 |
+
pointer-events: none;
|
| 725 |
+
}
|
| 726 |
+
|
| 727 |
+
.typewriter-text.is-ready {
|
| 728 |
+
opacity: 1;
|
| 729 |
+
transform: translateY(0) scale(1);
|
| 730 |
+
filter: blur(0);
|
| 731 |
+
text-shadow: 0 16px 28px rgba(92, 99, 231, 0.12);
|
| 732 |
+
}
|
| 733 |
+
|
| 734 |
+
.typewriter-text.is-ready::before {
|
| 735 |
+
opacity: 1;
|
| 736 |
+
animation: title-sheen 1.45s cubic-bezier(0.22, 1, 0.36, 1) 0.12s both;
|
| 737 |
+
}
|
| 738 |
+
|
| 739 |
+
.result-card {
|
| 740 |
+
margin: 6px 34px 18px;
|
| 741 |
+
padding: 0;
|
| 742 |
+
min-height: 0;
|
| 743 |
+
position: relative;
|
| 744 |
+
background: transparent;
|
| 745 |
+
border: 0;
|
| 746 |
+
box-shadow: none;
|
| 747 |
+
overflow: visible;
|
| 748 |
+
}
|
| 749 |
+
|
| 750 |
+
.result-card.is-visible {
|
| 751 |
+
animation: result-card-in 0.3s ease;
|
| 752 |
+
}
|
| 753 |
+
|
| 754 |
+
.result-card.is-updating {
|
| 755 |
+
box-shadow: none;
|
| 756 |
+
}
|
| 757 |
+
|
| 758 |
+
.result-feed {
|
| 759 |
+
display: grid;
|
| 760 |
+
gap: 16px;
|
| 761 |
+
}
|
| 762 |
+
|
| 763 |
+
.result-thread-item {
|
| 764 |
+
padding: 22px 24px;
|
| 765 |
+
border-radius: 24px;
|
| 766 |
+
background: rgba(255, 255, 255, 0.96);
|
| 767 |
+
border: 1px solid rgba(132, 141, 231, 0.14);
|
| 768 |
+
box-shadow: 0 14px 28px rgba(92, 99, 231, 0.06);
|
| 769 |
+
}
|
| 770 |
+
|
| 771 |
+
.result-thread-item:nth-child(even) {
|
| 772 |
+
background: rgba(255, 253, 249, 0.96);
|
| 773 |
+
border-color: rgba(240, 191, 143, 0.18);
|
| 774 |
+
}
|
| 775 |
+
|
| 776 |
+
.atom-loader {
|
| 777 |
+
--atom-size: 58px;
|
| 778 |
+
--atom-border: rgba(92, 99, 231, 0.26);
|
| 779 |
+
--atom-glow: rgba(92, 99, 231, 0.2);
|
| 780 |
+
position: relative;
|
| 781 |
+
width: var(--atom-size);
|
| 782 |
+
height: var(--atom-size);
|
| 783 |
+
display: inline-grid;
|
| 784 |
+
place-items: center;
|
| 785 |
+
border-radius: 50%;
|
| 786 |
+
filter: drop-shadow(0 10px 22px var(--atom-glow));
|
| 787 |
+
}
|
| 788 |
+
|
| 789 |
+
.atom-loader-sm {
|
| 790 |
+
--atom-size: 30px;
|
| 791 |
+
position: absolute;
|
| 792 |
+
left: 18px;
|
| 793 |
+
top: 50%;
|
| 794 |
+
transform: translateY(-50%);
|
| 795 |
+
z-index: 1;
|
| 796 |
+
}
|
| 797 |
+
|
| 798 |
+
.atom-loader-inline {
|
| 799 |
+
--atom-size: 42px;
|
| 800 |
+
}
|
| 801 |
+
|
| 802 |
+
.atom-core {
|
| 803 |
+
width: calc(var(--atom-size) * 0.28);
|
| 804 |
+
height: calc(var(--atom-size) * 0.28);
|
| 805 |
+
border-radius: 50%;
|
| 806 |
+
background: radial-gradient(circle at 35% 35%, #ffffff 0%, #ffe4eb 28%, #9ba9ff 68%, #5c63e7 100%);
|
| 807 |
+
box-shadow:
|
| 808 |
+
0 0 0 calc(var(--atom-size) * 0.085) rgba(255, 255, 255, 0.28),
|
| 809 |
+
0 0 calc(var(--atom-size) * 0.34) rgba(92, 99, 231, 0.24);
|
| 810 |
+
position: relative;
|
| 811 |
+
z-index: 2;
|
| 812 |
+
}
|
| 813 |
+
|
| 814 |
+
.atom-orbit {
|
| 815 |
+
position: absolute;
|
| 816 |
+
inset: 6%;
|
| 817 |
+
border: 1.5px solid var(--atom-border);
|
| 818 |
+
border-radius: 50%;
|
| 819 |
+
will-change: transform;
|
| 820 |
+
}
|
| 821 |
+
|
| 822 |
+
.atom-orbit-a {
|
| 823 |
+
transform: rotate(10deg);
|
| 824 |
+
}
|
| 825 |
+
|
| 826 |
+
.atom-orbit-b {
|
| 827 |
+
inset: 14%;
|
| 828 |
+
transform: rotate(72deg);
|
| 829 |
+
}
|
| 830 |
+
|
| 831 |
+
.atom-orbit-c {
|
| 832 |
+
inset: 14%;
|
| 833 |
+
transform: rotate(-58deg);
|
| 834 |
+
}
|
| 835 |
+
|
| 836 |
+
.atom-electron {
|
| 837 |
+
position: absolute;
|
| 838 |
+
width: calc(var(--atom-size) * 0.12);
|
| 839 |
+
height: calc(var(--atom-size) * 0.12);
|
| 840 |
+
border-radius: 50%;
|
| 841 |
+
box-shadow: 0 0 calc(var(--atom-size) * 0.18) rgba(255, 255, 255, 0.34);
|
| 842 |
+
}
|
| 843 |
+
|
| 844 |
+
.atom-orbit-a .atom-electron {
|
| 845 |
+
top: calc(var(--atom-size) * -0.045);
|
| 846 |
+
left: 50%;
|
| 847 |
+
transform: translateX(-50%);
|
| 848 |
+
background: radial-gradient(circle at 35% 35%, #ffffff 0%, #ffd7df 38%, #ff8ba7 100%);
|
| 849 |
+
}
|
| 850 |
+
|
| 851 |
+
.atom-orbit-b .atom-electron {
|
| 852 |
+
bottom: calc(var(--atom-size) * -0.05);
|
| 853 |
+
left: 50%;
|
| 854 |
+
transform: translateX(-50%);
|
| 855 |
+
background: radial-gradient(circle at 35% 35%, #ffffff 0%, #dbe2ff 42%, #7e90ff 100%);
|
| 856 |
+
}
|
| 857 |
+
|
| 858 |
+
.atom-orbit-c .atom-electron {
|
| 859 |
+
top: 50%;
|
| 860 |
+
right: calc(var(--atom-size) * -0.05);
|
| 861 |
+
transform: translateY(-50%);
|
| 862 |
+
background: radial-gradient(circle at 35% 35%, #ffffff 0%, #ffe6b9 40%, #f0b558 100%);
|
| 863 |
+
}
|
| 864 |
+
|
| 865 |
+
body.is-generating .result-card:not(.has-entry) .atom-orbit-a,
|
| 866 |
+
.generate-button.is-loading .atom-orbit-a {
|
| 867 |
+
animation: atom-orbit-a-spin 1.75s linear infinite;
|
| 868 |
+
}
|
| 869 |
+
|
| 870 |
+
body.is-generating .result-card:not(.has-entry) .atom-orbit-b,
|
| 871 |
+
.generate-button.is-loading .atom-orbit-b {
|
| 872 |
+
animation: atom-orbit-b-spin 1.2s linear infinite;
|
| 873 |
+
}
|
| 874 |
+
|
| 875 |
+
body.is-generating .result-card:not(.has-entry) .atom-orbit-c,
|
| 876 |
+
.generate-button.is-loading .atom-orbit-c {
|
| 877 |
+
animation: atom-orbit-c-spin 2.25s linear infinite;
|
| 878 |
+
}
|
| 879 |
+
|
| 880 |
+
body.is-generating .result-card:not(.has-entry) .atom-core,
|
| 881 |
+
.generate-button.is-loading .atom-core {
|
| 882 |
+
animation: atom-core-pulse 1s ease-in-out infinite alternate;
|
| 883 |
+
}
|
| 884 |
+
|
| 885 |
+
.result-meta {
|
| 886 |
+
display: flex;
|
| 887 |
+
flex-wrap: wrap;
|
| 888 |
+
gap: 8px;
|
| 889 |
+
margin-bottom: 16px;
|
| 890 |
+
}
|
| 891 |
+
|
| 892 |
+
.result-meta span {
|
| 893 |
+
min-height: 32px;
|
| 894 |
+
padding: 0 12px;
|
| 895 |
+
border-radius: 999px;
|
| 896 |
+
background: rgba(235, 238, 255, 0.9);
|
| 897 |
+
color: var(--accent-strong);
|
| 898 |
+
display: inline-flex;
|
| 899 |
+
align-items: center;
|
| 900 |
+
font-size: 0.82rem;
|
| 901 |
+
font-weight: 700;
|
| 902 |
+
line-height: 1;
|
| 903 |
+
}
|
| 904 |
+
|
| 905 |
+
.result-meta span:nth-child(4n + 2) {
|
| 906 |
+
background: rgba(255, 235, 240, 0.9);
|
| 907 |
+
color: #b44c70;
|
| 908 |
+
}
|
| 909 |
+
|
| 910 |
+
.result-meta span:nth-child(4n + 3) {
|
| 911 |
+
background: rgba(255, 243, 223, 0.95);
|
| 912 |
+
color: #9f6e19;
|
| 913 |
+
}
|
| 914 |
+
|
| 915 |
+
.result-meta span:nth-child(4n + 4) {
|
| 916 |
+
background: rgba(234, 246, 255, 0.95);
|
| 917 |
+
color: #356c97;
|
| 918 |
+
}
|
| 919 |
+
|
| 920 |
+
.result-meta span + span::before {
|
| 921 |
+
content: none;
|
| 922 |
+
}
|
| 923 |
+
|
| 924 |
+
.result-section {
|
| 925 |
+
padding: 0;
|
| 926 |
+
border: 0;
|
| 927 |
+
border-radius: 0;
|
| 928 |
+
background: transparent;
|
| 929 |
+
}
|
| 930 |
+
|
| 931 |
+
.result-section + .result-section {
|
| 932 |
+
margin-top: 18px;
|
| 933 |
+
padding-top: 18px;
|
| 934 |
+
border-top: 1px solid rgba(132, 141, 231, 0.14);
|
| 935 |
+
}
|
| 936 |
+
|
| 937 |
+
.result-source-title,
|
| 938 |
+
.result-questions-title {
|
| 939 |
+
margin: 0;
|
| 940 |
+
min-height: 32px;
|
| 941 |
+
padding: 0 14px;
|
| 942 |
+
border-radius: 999px;
|
| 943 |
+
display: inline-flex;
|
| 944 |
+
align-items: center;
|
| 945 |
+
font-size: 0.78rem;
|
| 946 |
+
font-weight: 800;
|
| 947 |
+
letter-spacing: 0.04em;
|
| 948 |
+
}
|
| 949 |
+
|
| 950 |
+
.result-source-title {
|
| 951 |
+
background: rgba(236, 238, 255, 0.92);
|
| 952 |
+
color: var(--accent-strong);
|
| 953 |
+
}
|
| 954 |
+
|
| 955 |
+
.result-questions-title {
|
| 956 |
+
background: rgba(255, 244, 225, 0.96);
|
| 957 |
+
color: #9f6e19;
|
| 958 |
+
}
|
| 959 |
+
|
| 960 |
+
.result-section-head {
|
| 961 |
+
display: flex;
|
| 962 |
+
align-items: center;
|
| 963 |
+
justify-content: space-between;
|
| 964 |
+
gap: 12px;
|
| 965 |
+
margin-bottom: 10px;
|
| 966 |
+
}
|
| 967 |
+
|
| 968 |
+
.copy-button {
|
| 969 |
+
width: auto;
|
| 970 |
+
height: auto;
|
| 971 |
+
padding: 4px;
|
| 972 |
+
border: 0;
|
| 973 |
+
border-radius: 10px;
|
| 974 |
+
background: transparent;
|
| 975 |
+
color: #68639a;
|
| 976 |
+
display: inline-grid;
|
| 977 |
+
place-items: center;
|
| 978 |
+
cursor: pointer;
|
| 979 |
+
transition: background 0.18s ease, color 0.18s ease;
|
| 980 |
+
}
|
| 981 |
+
|
| 982 |
+
.copy-button:hover {
|
| 983 |
+
background: rgba(235, 238, 255, 0.78);
|
| 984 |
+
color: var(--accent-strong);
|
| 985 |
+
}
|
| 986 |
+
|
| 987 |
+
.copy-button.is-copied {
|
| 988 |
+
background: rgba(226, 244, 233, 0.92);
|
| 989 |
+
color: #2f8a54;
|
| 990 |
+
}
|
| 991 |
+
|
| 992 |
+
.copy-button svg {
|
| 993 |
+
width: 17px;
|
| 994 |
+
height: 17px;
|
| 995 |
+
}
|
| 996 |
+
|
| 997 |
+
.result-source,
|
| 998 |
+
.result-note,
|
| 999 |
+
.result-message {
|
| 1000 |
+
margin: 0;
|
| 1001 |
+
padding: 0;
|
| 1002 |
+
line-height: 1.8;
|
| 1003 |
+
}
|
| 1004 |
+
|
| 1005 |
+
.result-source {
|
| 1006 |
+
background: transparent;
|
| 1007 |
+
border: 0;
|
| 1008 |
+
white-space: pre-wrap;
|
| 1009 |
+
}
|
| 1010 |
+
|
| 1011 |
+
.result-questions {
|
| 1012 |
+
display: grid;
|
| 1013 |
+
gap: 0;
|
| 1014 |
+
list-style: none;
|
| 1015 |
+
padding: 0;
|
| 1016 |
+
margin: 0;
|
| 1017 |
+
counter-reset: question;
|
| 1018 |
+
}
|
| 1019 |
+
|
| 1020 |
+
.result-questions li {
|
| 1021 |
+
position: relative;
|
| 1022 |
+
padding: 12px 0 12px 32px;
|
| 1023 |
+
background: transparent;
|
| 1024 |
+
border: 0;
|
| 1025 |
+
line-height: 1.75;
|
| 1026 |
+
counter-increment: question;
|
| 1027 |
+
}
|
| 1028 |
+
|
| 1029 |
+
.result-questions li + li {
|
| 1030 |
+
border-top: 1px solid rgba(132, 141, 231, 0.12);
|
| 1031 |
+
}
|
| 1032 |
+
|
| 1033 |
+
.result-questions li::before {
|
| 1034 |
+
content: counter(question) ".";
|
| 1035 |
+
position: absolute;
|
| 1036 |
+
left: 0;
|
| 1037 |
+
top: 12px;
|
| 1038 |
+
width: auto;
|
| 1039 |
+
height: auto;
|
| 1040 |
+
border-radius: 0;
|
| 1041 |
+
background: transparent;
|
| 1042 |
+
color: var(--accent-strong);
|
| 1043 |
+
display: inline-block;
|
| 1044 |
+
font-size: 0.84rem;
|
| 1045 |
+
font-weight: 700;
|
| 1046 |
+
}
|
| 1047 |
+
|
| 1048 |
+
.result-note {
|
| 1049 |
+
color: #726c9a;
|
| 1050 |
+
}
|
| 1051 |
+
|
| 1052 |
+
.result-pending {
|
| 1053 |
+
display: flex;
|
| 1054 |
+
align-items: center;
|
| 1055 |
+
gap: 14px;
|
| 1056 |
+
padding: 4px 0;
|
| 1057 |
+
background: transparent;
|
| 1058 |
+
}
|
| 1059 |
+
|
| 1060 |
+
.result-pending .result-note {
|
| 1061 |
+
padding: 0;
|
| 1062 |
+
background: transparent;
|
| 1063 |
+
}
|
| 1064 |
+
|
| 1065 |
+
.result-pending .atom-orbit-a {
|
| 1066 |
+
animation: atom-orbit-a-spin 1.75s linear infinite;
|
| 1067 |
+
}
|
| 1068 |
+
|
| 1069 |
+
.result-pending .atom-orbit-b {
|
| 1070 |
+
animation: atom-orbit-b-spin 1.2s linear infinite;
|
| 1071 |
+
}
|
| 1072 |
+
|
| 1073 |
+
.result-pending .atom-orbit-c {
|
| 1074 |
+
animation: atom-orbit-c-spin 2.25s linear infinite;
|
| 1075 |
+
}
|
| 1076 |
+
|
| 1077 |
+
.result-pending .atom-core {
|
| 1078 |
+
animation: atom-core-pulse 1s ease-in-out infinite alternate;
|
| 1079 |
+
}
|
| 1080 |
+
|
| 1081 |
+
.result-message {
|
| 1082 |
+
color: #9f3d61;
|
| 1083 |
+
}
|
| 1084 |
+
|
| 1085 |
+
.result-message-inline {
|
| 1086 |
+
margin: 0 0 14px;
|
| 1087 |
+
}
|
| 1088 |
+
|
| 1089 |
+
.landing-panel {
|
| 1090 |
+
margin: 0 18px 18px;
|
| 1091 |
+
display: grid;
|
| 1092 |
+
grid-template-columns: minmax(0, 1fr) minmax(0, 1.2fr) minmax(280px, 0.95fr);
|
| 1093 |
+
gap: 16px;
|
| 1094 |
+
}
|
| 1095 |
+
|
| 1096 |
+
.landing-card {
|
| 1097 |
+
padding: 20px 20px 18px;
|
| 1098 |
+
border-radius: 24px;
|
| 1099 |
+
background: linear-gradient(180deg, rgba(250, 249, 255, 0.98), rgba(255, 255, 255, 0.94));
|
| 1100 |
+
border: 1px solid rgba(143, 154, 238, 0.16);
|
| 1101 |
+
box-shadow: 0 14px 28px rgba(92, 99, 231, 0.06);
|
| 1102 |
+
display: grid;
|
| 1103 |
+
gap: 14px;
|
| 1104 |
+
}
|
| 1105 |
+
|
| 1106 |
+
.landing-card-head {
|
| 1107 |
+
padding: 14px 16px;
|
| 1108 |
+
border-radius: 18px;
|
| 1109 |
+
border: 1px solid rgba(146, 156, 239, 0.14);
|
| 1110 |
+
background: rgba(255, 255, 255, 0.78);
|
| 1111 |
+
box-shadow: inset 0 1px 0 rgba(255, 255, 255, 0.72);
|
| 1112 |
+
display: flex;
|
| 1113 |
+
align-items: start;
|
| 1114 |
+
justify-content: space-between;
|
| 1115 |
+
gap: 12px;
|
| 1116 |
+
flex-wrap: wrap;
|
| 1117 |
+
}
|
| 1118 |
+
|
| 1119 |
+
.landing-kicker,
|
| 1120 |
+
.landing-card-note,
|
| 1121 |
+
.landing-guide-list {
|
| 1122 |
+
margin: 0;
|
| 1123 |
+
}
|
| 1124 |
+
|
| 1125 |
+
.landing-kicker {
|
| 1126 |
+
font-size: 0.8rem;
|
| 1127 |
+
font-weight: 800;
|
| 1128 |
+
letter-spacing: 0.04em;
|
| 1129 |
+
color: #7b75a6;
|
| 1130 |
+
flex: none;
|
| 1131 |
+
}
|
| 1132 |
+
|
| 1133 |
+
.landing-card-note {
|
| 1134 |
+
font-size: 0.88rem;
|
| 1135 |
+
line-height: 1.65;
|
| 1136 |
+
color: #807aa8;
|
| 1137 |
+
flex: 1 1 220px;
|
| 1138 |
+
min-width: 0;
|
| 1139 |
+
}
|
| 1140 |
+
|
| 1141 |
+
.landing-samples .landing-card-head {
|
| 1142 |
+
display: grid;
|
| 1143 |
+
grid-template-columns: 1fr;
|
| 1144 |
+
align-items: start;
|
| 1145 |
+
gap: 8px;
|
| 1146 |
+
}
|
| 1147 |
+
|
| 1148 |
+
.landing-samples .landing-kicker {
|
| 1149 |
+
white-space: nowrap;
|
| 1150 |
+
}
|
| 1151 |
+
|
| 1152 |
+
.landing-samples .landing-card-note {
|
| 1153 |
+
white-space: normal;
|
| 1154 |
+
}
|
| 1155 |
+
|
| 1156 |
+
.landing-samples .landing-card-note {
|
| 1157 |
+
font-size: 0.84rem;
|
| 1158 |
+
line-height: 1.4;
|
| 1159 |
+
}
|
| 1160 |
+
|
| 1161 |
+
.landing-guide-list {
|
| 1162 |
+
padding: 0;
|
| 1163 |
+
list-style: none;
|
| 1164 |
+
counter-reset: landing-step;
|
| 1165 |
+
display: grid;
|
| 1166 |
+
gap: 10px;
|
| 1167 |
+
color: #4f4a79;
|
| 1168 |
+
line-height: 1.65;
|
| 1169 |
+
}
|
| 1170 |
+
|
| 1171 |
+
.landing-guide-list li {
|
| 1172 |
+
position: relative;
|
| 1173 |
+
padding: 14px 16px 14px 50px;
|
| 1174 |
+
border-radius: 18px;
|
| 1175 |
+
border: 1px solid rgba(146, 156, 239, 0.14);
|
| 1176 |
+
background: rgba(255, 255, 255, 0.78);
|
| 1177 |
+
counter-increment: landing-step;
|
| 1178 |
+
}
|
| 1179 |
+
|
| 1180 |
+
.landing-guide-list li::before {
|
| 1181 |
+
content: counter(landing-step);
|
| 1182 |
+
position: absolute;
|
| 1183 |
+
left: 16px;
|
| 1184 |
+
top: 14px;
|
| 1185 |
+
width: 22px;
|
| 1186 |
+
height: 22px;
|
| 1187 |
+
border-radius: 999px;
|
| 1188 |
+
background: rgba(92, 99, 231, 0.12);
|
| 1189 |
+
color: var(--accent-strong);
|
| 1190 |
+
display: inline-grid;
|
| 1191 |
+
place-items: center;
|
| 1192 |
+
font-size: 0.76rem;
|
| 1193 |
+
font-weight: 800;
|
| 1194 |
+
line-height: 1;
|
| 1195 |
+
}
|
| 1196 |
+
|
| 1197 |
+
.landing-guide-list strong {
|
| 1198 |
+
color: var(--accent-strong);
|
| 1199 |
+
}
|
| 1200 |
+
|
| 1201 |
+
.landing-sample-grid {
|
| 1202 |
+
display: grid;
|
| 1203 |
+
gap: 12px;
|
| 1204 |
+
}
|
| 1205 |
+
|
| 1206 |
+
.sample-card {
|
| 1207 |
+
width: 100%;
|
| 1208 |
+
padding: 14px 16px;
|
| 1209 |
+
border: 1px solid rgba(146, 156, 239, 0.16);
|
| 1210 |
+
border-radius: 18px;
|
| 1211 |
+
background: rgba(255, 255, 255, 0.84);
|
| 1212 |
+
color: inherit;
|
| 1213 |
+
display: grid;
|
| 1214 |
+
gap: 6px;
|
| 1215 |
+
text-align: left;
|
| 1216 |
+
cursor: pointer;
|
| 1217 |
+
transition: transform 0.18s ease, box-shadow 0.18s ease, border-color 0.18s ease, background 0.18s ease;
|
| 1218 |
+
}
|
| 1219 |
+
|
| 1220 |
+
.sample-card:hover {
|
| 1221 |
+
transform: translateY(-1px);
|
| 1222 |
+
border-color: rgba(92, 99, 231, 0.24);
|
| 1223 |
+
background: rgba(255, 255, 255, 0.96);
|
| 1224 |
+
box-shadow: 0 12px 22px rgba(92, 99, 231, 0.08);
|
| 1225 |
+
}
|
| 1226 |
+
|
| 1227 |
+
.sample-card strong,
|
| 1228 |
+
.sample-card span {
|
| 1229 |
+
display: block;
|
| 1230 |
+
}
|
| 1231 |
+
|
| 1232 |
+
.sample-card strong {
|
| 1233 |
+
font-size: 0.95rem;
|
| 1234 |
+
color: #474271;
|
| 1235 |
+
}
|
| 1236 |
+
|
| 1237 |
+
.sample-card span {
|
| 1238 |
+
font-size: 0.84rem;
|
| 1239 |
+
line-height: 1.55;
|
| 1240 |
+
color: #7d78a6;
|
| 1241 |
+
}
|
| 1242 |
+
|
| 1243 |
+
.landing-runtime-badge {
|
| 1244 |
+
min-height: 30px;
|
| 1245 |
+
padding: 0 12px;
|
| 1246 |
+
border-radius: 999px;
|
| 1247 |
+
display: inline-flex;
|
| 1248 |
+
align-items: center;
|
| 1249 |
+
flex: none;
|
| 1250 |
+
font-size: 0.78rem;
|
| 1251 |
+
font-weight: 800;
|
| 1252 |
+
letter-spacing: 0.02em;
|
| 1253 |
+
}
|
| 1254 |
+
|
| 1255 |
+
.landing-runtime-badge.is-ready {
|
| 1256 |
+
background: rgba(227, 244, 233, 0.92);
|
| 1257 |
+
color: #2f8a54;
|
| 1258 |
+
}
|
| 1259 |
+
|
| 1260 |
+
.landing-runtime-badge.is-pending {
|
| 1261 |
+
background: rgba(236, 238, 255, 0.92);
|
| 1262 |
+
color: var(--accent-strong);
|
| 1263 |
+
}
|
| 1264 |
+
|
| 1265 |
+
.landing-runtime-badge.is-error {
|
| 1266 |
+
  background: rgba(255, 235, 240, 0.92);
  color: #b44c70;
}

.landing-system > .landing-card-note {
  padding: 14px 16px;
  border-radius: 18px;
  border: 1px solid rgba(146, 156, 239, 0.14);
  background: rgba(255, 255, 255, 0.78);
}

.landing-system-list {
  display: grid;
  gap: 10px;
}

.landing-system-row {
  display: grid;
  gap: 4px;
  padding: 14px 16px;
  border: 1px solid rgba(146, 156, 239, 0.14);
  border-radius: 18px;
  background: rgba(255, 255, 255, 0.78);
}

.landing-system-row:first-child {
  padding-top: 14px;
}

.landing-system-row span,
.landing-system-row strong {
  display: block;
}

.landing-system-row span {
  font-size: 0.76rem;
  font-weight: 700;
  letter-spacing: 0.03em;
  color: #8b86b2;
}

.landing-system-row strong {
  font-size: 0.96rem;
  line-height: 1.45;
  color: #474271;
  overflow-wrap: anywhere;
  word-break: break-word;
}

.composer {
  margin: 0 18px 18px;
  padding: 20px 22px;
  border-radius: 28px;
  background: linear-gradient(180deg, rgba(247, 248, 255, 0.96), rgba(255, 255, 255, 0.94));
  border-color: rgba(143, 154, 238, 0.22);
}

.input-shell {
  min-width: 0;
  display: flex;
  flex-direction: column;
  align-items: stretch;
  gap: 18px;
  min-height: 128px;
  padding: 0;
  border: 0;
  background: transparent;
  transition:
    min-height 0.22s ease,
    padding 0.22s ease;
}

.input-shell.is-expanded {
  min-height: 172px;
}

.visually-hidden {
  position: absolute;
  width: 1px;
  height: 1px;
  padding: 0;
  margin: -1px;
  overflow: hidden;
  clip: rect(0, 0, 0, 0);
  white-space: nowrap;
  border: 0;
}

.input-shell textarea {
  width: 100%;
  min-height: 30px;
  max-height: 240px;
  border: 0;
  outline: none;
  resize: none;
  background: transparent;
  color: #4b4670;
  line-height: 1.72;
  padding: 0;
  transition: height 0.18s ease;
}

.input-shell textarea::placeholder {
  color: #8e88b6;
}

.voice-status {
  min-height: 0;
  font-size: 0.82rem;
  color: #7f7aa8;
  line-height: 1.5;
  max-width: min(100%, 420px);
  text-align: right;
}

.voice-status.is-empty {
  display: none;
}

.voice-status.is-active {
  color: var(--accent-strong);
}

.voice-status.is-error {
  color: #b44c70;
}

.voice-button {
  width: 42px;
  min-width: 42px;
  min-height: 42px;
  padding: 0;
  border: 1px solid rgba(128, 138, 235, 0.2);
  border-radius: 999px;
  background: rgba(240, 242, 255, 0.82);
  color: #5f5aa0;
  display: inline-flex;
  align-items: center;
  justify-content: center;
  cursor: pointer;
  transition:
    transform 0.18s ease,
    box-shadow 0.18s ease,
    border-color 0.18s ease,
    background 0.18s ease,
    color 0.18s ease;
}

.voice-button:hover:not(:disabled) {
  transform: translateY(-1px);
  border-color: rgba(92, 99, 231, 0.3);
  background: rgba(235, 238, 255, 0.94);
  box-shadow: 0 10px 20px rgba(92, 99, 231, 0.08);
}

.voice-button.is-listening {
  border-color: rgba(224, 96, 124, 0.28);
  background: rgba(255, 235, 240, 0.92);
  color: #b44c70;
  box-shadow: 0 0 0 6px rgba(224, 96, 124, 0.08);
}

.voice-button:disabled,
.voice-button.is-unsupported {
  opacity: 0.56;
  cursor: not-allowed;
  box-shadow: none;
}

.voice-button-icon {
  width: 18px;
  height: 18px;
  display: inline-flex;
}

.voice-button-icon svg {
  width: 100%;
  height: 100%;
}

.composer-actions {
  display: flex;
  align-items: end;
  justify-content: space-between;
  gap: 14px;
  padding-top: 16px;
  border-top: 1px solid rgba(132, 141, 231, 0.12);
}

.action-cluster {
  min-width: 0;
  margin-left: auto;
  display: grid;
  justify-items: end;
  gap: 10px;
}

.action-buttons {
  display: flex;
  align-items: center;
  justify-content: flex-end;
  gap: 10px;
  min-width: 0;
}

.count-shell {
  min-height: 0;
  padding: 0;
  display: inline-grid;
  gap: 8px;
  color: #6a6493;
  white-space: nowrap;
}

.count-label {
  font-size: 0.78rem;
  font-weight: 500;
  color: #7a75a6;
}

.count-stepper {
  display: inline-grid;
  grid-template-columns: 42px minmax(62px, auto) 42px;
  align-items: center;
  gap: 8px;
}

.count-button {
  width: 42px;
  height: 42px;
  border: 0;
  border-radius: 12px;
  background: linear-gradient(180deg, rgba(231, 235, 255, 0.98), rgba(246, 244, 255, 0.96));
  color: var(--accent-strong);
  display: inline-grid;
  place-items: center;
  font-weight: 700;
  cursor: pointer;
  transition: transform 0.18s ease, box-shadow 0.18s ease, opacity 0.18s ease;
}

.count-button:hover:not(:disabled) {
  transform: translateY(-1px);
  box-shadow: 0 10px 20px rgba(92, 99, 231, 0.14);
}

.count-button:disabled {
  opacity: 0.42;
  cursor: default;
}

.count-button span {
  font-size: 1.3rem;
  line-height: 1;
}

.count-value {
  min-width: 62px;
  min-height: 42px;
  border-radius: 12px;
  background: rgba(240, 242, 255, 0.7);
  color: #4f4a7a;
  display: inline-grid;
  place-items: center;
  font-weight: 800;
}

.generate-button {
  min-width: 214px;
  min-height: 58px;
  padding: 0 20px 0 62px;
  border-radius: 16px;
  border: 0;
  background: linear-gradient(135deg, var(--accent), var(--warm));
  box-shadow: 0 14px 30px rgba(93, 99, 190, 0.22);
  color: #fff;
  position: relative;
  overflow: hidden;
  display: inline-flex;
  align-items: center;
  justify-content: center;
  gap: 12px;
  cursor: pointer;
  transition: transform 0.18s ease, box-shadow 0.18s ease;
}

.generate-button:hover:not(:disabled) {
  transform: translateY(-1px);
  box-shadow: 0 18px 34px rgba(93, 99, 190, 0.26);
}

.generate-button:disabled {
  cursor: wait;
}

.generate-button::before {
  content: "";
  position: absolute;
  inset: 1px;
  border-radius: inherit;
  background: linear-gradient(135deg, rgba(255, 255, 255, 0.12), transparent 48%, rgba(255, 255, 255, 0.1));
}

.generate-button .atom-loader {
  --atom-border: rgba(255, 255, 255, 0.34);
  --atom-glow: rgba(255, 255, 255, 0.28);
  opacity: 0.94;
  transition: transform 0.24s ease, opacity 0.24s ease, filter 0.24s ease;
}

.generate-button .atom-core {
  background: radial-gradient(circle at 35% 35%, #ffffff 0%, #ffe7ed 30%, #ffc9d4 58%, #ffffff 100%);
  box-shadow:
    0 0 0 calc(var(--atom-size) * 0.085) rgba(255, 255, 255, 0.16),
    0 0 calc(var(--atom-size) * 0.26) rgba(255, 255, 255, 0.22);
}

.generate-button.is-loading .atom-loader {
  opacity: 1;
  filter: drop-shadow(0 0 14px rgba(255, 255, 255, 0.32));
}

.button-label {
  font-size: 0.96rem;
  font-weight: 700;
  position: relative;
  z-index: 1;
}

@keyframes result-card-in {
  from {
    opacity: 0;
    transform: translateY(16px);
  }

  to {
    opacity: 1;
    transform: translateY(0);
  }
}

@keyframes atom-core-pulse {
  from {
    transform: scale(0.92);
  }

  to {
    transform: scale(1.1);
  }
}

@keyframes atom-orbit-a-spin {
  from {
    transform: rotate(10deg);
  }

  to {
    transform: rotate(370deg);
  }
}

@keyframes atom-orbit-b-spin {
  from {
    transform: rotate(72deg);
  }

  to {
    transform: rotate(-288deg);
  }
}

@keyframes atom-orbit-c-spin {
  from {
    transform: rotate(-58deg);
  }

  to {
    transform: rotate(302deg);
  }
}

@keyframes title-sheen {
  0% {
    transform: translateX(-135%) skewX(-18deg);
  }

  100% {
    transform: translateX(135%) skewX(-18deg);
  }
}

@media (max-width: 1180px) {
  .landing-panel {
    grid-template-columns: repeat(2, minmax(0, 1fr));
  }

  .landing-system {
    grid-column: 1 / -1;
  }

  .composer-actions {
    flex-direction: column;
    align-items: stretch;
  }

  .action-cluster {
    width: 100%;
    justify-items: stretch;
  }

  .action-buttons {
    width: 100%;
    justify-content: flex-start;
  }

  .voice-status {
    max-width: none;
    text-align: left;
  }

  .count-shell {
    width: 100%;
  }

  .action-buttons .generate-button {
    width: auto;
    min-width: 0;
    flex: 1 1 auto;
  }
}

@media (max-width: 900px) {
  .page-shell {
    width: 100%;
    min-height: 100vh;
    margin: 0;
    border-radius: 0;
    grid-template-columns: 1fr;
  }

  body.sidebar-open .page-shell,
  .page-shell {
    grid-template-columns: 1fr;
  }

  .sidebar {
    border-right: 0;
    border-bottom: 1px solid var(--line);
  }

  .topbar,
  .hero-panel {
    padding-left: 20px;
    padding-right: 20px;
  }

  .result-card {
    margin-left: 20px;
    margin-right: 20px;
  }

  .landing-panel {
    margin-left: 12px;
    margin-right: 12px;
    grid-template-columns: 1fr;
  }

  .landing-system {
    grid-column: auto;
  }

  .landing-samples .landing-card-head {
    grid-template-columns: 1fr;
  }

  .landing-samples .landing-kicker,
  .landing-samples .landing-card-note {
    white-space: normal;
  }

  .composer {
    margin-left: 12px;
    margin-right: 12px;
  }
}

@media (max-width: 760px) {
  .identity-copy h1,
  .hero-copy h2 {
    white-space: normal;
  }
}

@media (max-width: 640px) {
  .topbar {
    padding-top: 30px;
  }

  .logo {
    width: 68px;
    height: 68px;
  }

  .hero-copy h2 {
    font-size: clamp(1.35rem, 7.4vw, 2.1rem);
    min-height: auto;
  }

  .result-card {
    min-height: 180px;
    padding: 0;
  }

  .generate-button {
    min-width: 0;
  }
}

@media (prefers-reduced-motion: reduce) {
  *,
  *::before,
  *::after {
    animation: none !important;
    transition: none !important;
    scroll-behavior: auto !important;
  }
}
HVU_QA/generate_question.py
ADDED
@@ -0,0 +1,383 @@
from __future__ import annotations

import argparse
import json
import os
import re
import sys
import threading
from pathlib import Path
from typing import Any

os.environ.setdefault("TOKENIZERS_PARALLELISM", "false")
os.environ.setdefault("PYTORCH_CUDA_ALLOC_CONF", "expandable_segments:True")


def raise_missing_dependency_error(exc: ModuleNotFoundError) -> None:
    root = Path(__file__).resolve().parent
    requirements = root / "requirements.txt"
    message = [
        f"Thiếu thư viện Python: {exc.name}",
        f"Interpreter hiện tại: {sys.executable}",
    ]
    if requirements.exists():
        message.extend(
            [
                "Cài đặt dependencies bằng lệnh:",
                f"{sys.executable} -m pip install -r {requirements}",
            ]
        )
    raise SystemExit("\n".join(message)) from exc


try:
    import torch
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
except ModuleNotFoundError as exc:
    raise_missing_dependency_error(exc)


APP_TITLE = "Mô hình sinh câu hỏi thường gặp"
TASK_PREFIX = "sinh câu hỏi"
QUESTION_LIMIT = 100
GENERATION_PASSES = (
    (0.9, 0.95, None, 1, 4),
    (1.0, 0.97, 16, 1, 5),
    (1.08, 0.99, 8, 2, 6),
)


def normalize_text(text: Any) -> str:
    return " ".join(str(text or "").split())


def unique_text(items: list[str]) -> list[str]:
    seen: set[str] = set()
    output: list[str] = []
    for item in items:
        value = normalize_text(item)
        key = value.lower()
        if key and key not in seen:
            seen.add(key)
            output.append(value)
    return output


def parse_question_count(value: Any, default: int = 5) -> int:
    try:
        parsed = int(value)
    except (TypeError, ValueError):
        parsed = default
    return max(1, min(parsed, QUESTION_LIMIT))


def format_questions(items: list[str]) -> str:
    if not items:
        return "Không sinh được câu hỏi phù hợp."
    return "\n".join(f"{index}. {item}" for index, item in enumerate(items, 1))


def resolve_model_dir(model_dir: str | Path, prefer_nested_model: bool = True) -> Path:
    model_root = Path(model_dir).expanduser().resolve()
    nested_candidates = [model_root / "best-model", model_root / "final-model"]
    candidates = [*nested_candidates, model_root] if prefer_nested_model else [model_root, *nested_candidates]
    for candidate in candidates:
        if candidate.is_dir() and (candidate / "config.json").exists():
            return candidate
    raise FileNotFoundError(f"Không tìm thấy thư mục mô hình hợp lệ: {model_root}")


def parse_dtype(value: str) -> torch.dtype:
    normalized = value.strip().lower()
    mapping = {
        "float16": torch.float16,
        "fp16": torch.float16,
        "float32": torch.float32,
        "fp32": torch.float32,
        "bfloat16": torch.bfloat16,
        "bf16": torch.bfloat16,
    }
    if normalized not in mapping:
        raise ValueError(f"Không hỗ trợ gpu_dtype={value}")
    return mapping[normalized]


class QuestionGenerator:
    def __init__(
        self,
        model_dir: str | Path = "t5-viet-qg-finetuned",
        task_prefix: str = TASK_PREFIX,
        max_source_length: int = 512,
        max_new_tokens: int = 64,
        device: str = "auto",
        cpu_threads: int | None = None,
        gpu_dtype: str = "auto",
        prefer_nested_model: bool = True,
    ) -> None:
        self.model_root = Path(model_dir).expanduser().resolve()
        self.model_dir = resolve_model_dir(model_dir, prefer_nested_model=prefer_nested_model)
        self.task_prefix = task_prefix
        self.max_source_length = max_source_length
        self.max_new_tokens = max_new_tokens
        self.requested_device = device
        self.cpu_threads = cpu_threads
        self.gpu_dtype = gpu_dtype
        self.prefer_nested_model = prefer_nested_model
        self.device: torch.device | None = None
        self.dtype: torch.dtype | None = None
        self.tokenizer = None
        self.model = None
        self._load_lock = threading.Lock()

    def _resolve_device(self) -> torch.device:
        requested = self.requested_device.lower()
        if requested == "cpu":
            return torch.device("cpu")
        if requested == "cuda":
            if not torch.cuda.is_available():
                raise RuntimeError("Bạn đã chọn device=cuda nhưng máy hiện tại không có CUDA.")
            return torch.device("cuda")
        return torch.device("cuda" if torch.cuda.is_available() else "cpu")

    def _resolve_dtype(self) -> torch.dtype:
        if self.device is None or self.device.type != "cuda":
            return torch.float32
        if self.gpu_dtype == "auto":
            if hasattr(torch.cuda, "is_bf16_supported") and torch.cuda.is_bf16_supported():
                return torch.bfloat16
            return torch.float16
        return parse_dtype(self.gpu_dtype)

    def _configure_runtime(self) -> None:
        if self.device is None:
            return
        if self.device.type == "cpu":
            if self.cpu_threads:
                torch.set_num_threads(max(1, int(self.cpu_threads)))
                if hasattr(torch, "set_num_interop_threads"):
                    torch.set_num_interop_threads(max(1, min(int(self.cpu_threads), 4)))
            return

        if hasattr(torch.backends, "cuda") and hasattr(torch.backends.cuda, "matmul"):
            torch.backends.cuda.matmul.allow_tf32 = True
        if hasattr(torch.backends, "cudnn"):
            torch.backends.cudnn.allow_tf32 = True
            torch.backends.cudnn.benchmark = True

    def load(self) -> None:
        if self.model is not None and self.tokenizer is not None:
            return

        with self._load_lock:
            if self.model is not None and self.tokenizer is not None:
                return

            self.device = self._resolve_device()
            self.dtype = self._resolve_dtype()
            self._configure_runtime()

            model_kwargs: dict[str, Any] = {}
            if self.device.type == "cuda":
                model_kwargs["torch_dtype"] = self.dtype
                model_kwargs["low_cpu_mem_usage"] = True

            self.tokenizer = AutoTokenizer.from_pretrained(str(self.model_dir), use_fast=True)
            self.model = AutoModelForSeq2SeqLM.from_pretrained(str(self.model_dir), **model_kwargs)
            self.model.to(self.device)
            self.model.eval()

    def metadata(self) -> dict[str, Any]:
        active_device = self.device.type if self.device is not None else None
        predicted_device = "cuda" if torch.cuda.is_available() and self.requested_device != "cpu" else "cpu"
        return {
            "title": APP_TITLE,
            "model_root": str(self.model_root),
            "model_dir": str(self.model_dir),
            "requested_device": self.requested_device,
            "active_device": active_device,
            "predicted_device": predicted_device,
            "loaded": self.model is not None,
            "gpu_available": torch.cuda.is_available(),
            "gpu_dtype": None if self.dtype is None else str(self.dtype).replace("torch.", ""),
            "cpu_threads": torch.get_num_threads(),
        }

    def _candidate_answers(self, text: str, limit: int) -> list[str]:
        text = normalize_text(text)
        if not text:
            return []

        candidates: list[str] = []
        split_pattern = r"(?<=[.!?])\s+|\n+"
        for sentence in [normalize_text(part) for part in re.split(split_pattern, text) if normalize_text(part)]:
            if 3 <= len(sentence.split()) <= 30:
                candidates.append(sentence)
            for clause in (normalize_text(part) for part in re.split(r"\s*[,;:]\s*", sentence)):
                if 3 <= len(clause.split()) <= 20:
                    candidates.append(clause)

        if not candidates:
            words = text.split()
            candidates = [" ".join(words[: min(12, len(words))])] if words else [text]

        ranked = sorted(unique_text(candidates), key=lambda item: (abs(len(item.split()) - 10), len(item)))
        return ranked[:limit]

    def _build_prompt(self, context: str, answer: str) -> str:
        return f"{self.task_prefix}:\nngữ cảnh: {context}\nđáp án: {answer}"

    @torch.inference_mode()
    def _sample(self, context: str, answer: str, count: int, temperature: float, top_p: float) -> list[str]:
        if self.tokenizer is None or self.model is None or self.device is None:
            raise RuntimeError("Model chưa được load.")

        inputs = self.tokenizer(
            self._build_prompt(context, answer),
            return_tensors="pt",
            truncation=True,
            max_length=self.max_source_length,
        ).to(self.device)
        outputs = self.model.generate(
            **inputs,
            max_new_tokens=self.max_new_tokens,
            do_sample=True,
            temperature=temperature,
            top_p=top_p,
            num_return_sequences=count,
            no_repeat_ngram_size=3,
            repetition_penalty=1.1,
        )
        questions: list[str] = []
        for token_ids in outputs:
            question = normalize_text(self.tokenizer.decode(token_ids, skip_special_tokens=True))
            if question:
                questions.append(question if question.endswith("?") else f"{question}?")
        return [question for question in unique_text(questions) if len(question.split()) >= 3]

    @torch.inference_mode()
    def _beam_search(self, context: str, answer: str, count: int) -> list[str]:
        if self.tokenizer is None or self.model is None or self.device is None:
            raise RuntimeError("Model chưa được load.")

        inputs = self.tokenizer(
            self._build_prompt(context, answer),
            return_tensors="pt",
            truncation=True,
            max_length=self.max_source_length,
        ).to(self.device)
        outputs = self.model.generate(
            **inputs,
            max_new_tokens=self.max_new_tokens,
            num_beams=max(4, count),
            num_return_sequences=min(count, 4),
            early_stopping=True,
            no_repeat_ngram_size=3,
            repetition_penalty=1.1,
        )
        questions: list[str] = []
        for token_ids in outputs:
            question = normalize_text(self.tokenizer.decode(token_ids, skip_special_tokens=True))
            if question:
                questions.append(question if question.endswith("?") else f"{question}?")
        return [question for question in unique_text(questions) if len(question.split()) >= 3]

    def generate(self, text: str, count: int = 5) -> list[str]:
        self.load()
        context = normalize_text(text)
        if not context:
            raise ValueError("Vui lòng nhập đoạn văn.")

        count = parse_question_count(count)
        pool = unique_text(
            self._candidate_answers(context, max(32, count * 5)) + [context[:180], context[:280], context]
        )
        output: list[str] = []
        seen: set[str] = set()

        for temperature, top_p, limit, rounds, floor in GENERATION_PASSES:
            answers = pool[:limit] if limit else pool
            for _ in range(rounds):
                for answer in answers:
                    remaining = count - len(output)
                    if remaining <= 0:
                        return output[:count]
                    sample_count = min(8, max(floor, remaining * 2))
                    for question in self._sample(context, answer, sample_count, temperature, top_p):
                        key = question.lower()
                        if key not in seen:
                            seen.add(key)
                            output.append(question)
                            if len(output) >= count:
                                return output[:count]

        for answer in pool[: min(8, len(pool))]:
            remaining = count - len(output)
            if remaining <= 0:
                break
            for question in self._beam_search(context, answer, remaining):
                key = question.lower()
                if key not in seen:
                    seen.add(key)
                    output.append(question)
                    if len(output) >= count:
                        break

        return output[:count]


def read_input_text(args: argparse.Namespace) -> str:
    if args.text:
        return args.text
    if args.input_file:
        return Path(args.input_file).read_text(encoding="utf-8")
    if sys.stdin.isatty():
        return input("Nhập đoạn văn cần sinh câu hỏi:\n").strip()
    return sys.stdin.read().strip()


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(description="Sinh câu hỏi từ đoạn văn bằng model T5 fine-tuned.")
    parser.add_argument("--model_dir", default="t5-viet-qg-finetuned")
    parser.add_argument("--task_prefix", default=TASK_PREFIX)
    parser.add_argument("--max_source_length", type=int, default=512)
    parser.add_argument("--max_new_tokens", type=int, default=64)
    parser.add_argument("--num_questions", type=int, default=100)
    parser.add_argument("--device", choices=["auto", "cpu", "cuda"], default="auto")
    parser.add_argument("--cpu_threads", type=int, default=None)
    parser.add_argument("--gpu_dtype", default="auto")
    parser.add_argument("--text", default=None)
    parser.add_argument("--input_file", default=None)
    parser.add_argument("--output_format", choices=["text", "json"], default="text")
    return parser


def main() -> None:
    args = build_parser().parse_args()
    if hasattr(sys.stdout, "reconfigure"):
        sys.stdout.reconfigure(encoding="utf-8")
    generator = QuestionGenerator(
        model_dir=args.model_dir,
        task_prefix=args.task_prefix,
        max_source_length=args.max_source_length,
        max_new_tokens=args.max_new_tokens,
        device=args.device,
        cpu_threads=args.cpu_threads,
        gpu_dtype=args.gpu_dtype,
        prefer_nested_model=True,
    )
    text = read_input_text(args)
    questions = generator.generate(text, parse_question_count(args.num_questions))
    payload = {
        "text": normalize_text(text),
        "questions": questions,
        "formatted": format_questions(questions),
        "meta": generator.metadata(),
    }
    if args.output_format == "json":
        print(json.dumps(payload, ensure_ascii=False, indent=2))
        return
    print(payload["formatted"])


if __name__ == "__main__":
    main()
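The `generate` method above walks the `GENERATION_PASSES` schedule, deduplicates case-insensitively, and stops as soon as the requested count is reached. A minimal sketch of that budgeting loop, decoupled from the model, is shown below; `sampler` is a hypothetical stand-in for the `_sample` call, and `passes` follows the same `(temperature, top_p, limit, rounds, floor)` tuple shape:

```python
def collect_questions(answers, sampler, count, passes):
    """Sketch of QuestionGenerator.generate's pass/dedup budgeting.

    `sampler(answer, n, temperature, top_p)` is a hypothetical stub for the
    model call; here it just needs to return a list of question strings.
    """
    output, seen = [], set()
    for temperature, top_p, limit, rounds, floor in passes:
        pool = answers[:limit] if limit else answers
        for _ in range(rounds):
            for answer in pool:
                remaining = count - len(output)
                if remaining <= 0:
                    return output
                # Ask for more samples than needed (capped at 8) to survive dedup.
                n = min(8, max(floor, remaining * 2))
                for question in sampler(answer, n, temperature, top_p):
                    key = question.lower()  # case-insensitive dedup, as in generate()
                    if key not in seen:
                        seen.add(key)
                        output.append(question)
                        if len(output) >= count:
                            return output
    return output
```

With a deterministic stub sampler this returns exactly `count` unique questions even when the sampler emits duplicates, which is the property the real multi-pass loop relies on.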
HVU_QA/main.py
ADDED
@@ -0,0 +1,31 @@
from __future__ import annotations

import os
import threading
import webbrowser

from backend import create_app

app = create_app()


def _as_bool(value: str | None, default: bool) -> bool:
    if value is None:
        return default
    return value.strip().lower() not in {"0", "false", "no", "off"}


def _open_browser_later(host: str, port: int) -> None:
    if not _as_bool(os.getenv("HVU_OPEN_BROWSER"), True):
        return
    target_host = "127.0.0.1" if host in {"0.0.0.0", "::"} else host
    url = f"http://{target_host}:{port}"
    threading.Timer(1.2, lambda: webbrowser.open(url)).start()


if __name__ == "__main__":
    host = os.getenv("HVU_HOST", "127.0.0.1")
    port = int(os.getenv("HVU_PORT", "5000"))
    debug = _as_bool(os.getenv("HVU_DEBUG"), False)
    _open_browser_later(host, port)
    app.run(host=host, port=port, debug=debug, use_reloader=False)
HVU_QA/readme.md
ADDED
@@ -0,0 +1,392 @@
# HVU_QA

An application that generates questions from a text passage using a Vietnamese T5 model.

This repo currently supports **two ways to use it**:

- **Full project**: download the entire source code to use, modify, test, and develop further.
- **Standalone tool**: with just `HVU_QA_tool.py`, or `HVU_QA_tool.py` + `HVU_QA_tool.bat`, the launcher builds its own runtime, installs dependencies, downloads the model, and opens the app.

## 1. Project structure

```text
HVU_QA/
├── backend/
│   ├── __init__.py
│   └── app.py
├── frontend/
│   ├── app.js
│   ├── index.html
│   └── style.css
├── t5-viet-qg-finetuned/
│   ├── config.json
│   ├── model.safetensors
│   ├── ...
│   └── best-model/
├── 40k_train.json
├── fine_tune_qg.py
├── generate_question.py
├── HVU.png
├── HVU_QA_end_to_end_guide.ipynb
├── HVU_QA_tool.py
├── HVU_QA_tool.bat
├── main.py
├── readme.md
└── requirements.txt
```

## 2. Main components

- `main.py`: main entry point of the Flask web app in the full project.
- `backend/app.py`: web routes and backend API of the full project.
- `frontend/index.html`, `frontend/app.js`, `frontend/style.css`: UI and frontend logic of the full project.
- `generate_question.py`: core model loading and question generation, also usable as a CLI.
- `fine_tune_qg.py`: model fine-tuning script.
- `HVU_QA_tool.py`: multi-mode launcher.
  - Inside the full project: uses the existing source.
  - With only the tool file present: creates `.hvu_qa_tool_venv/` and `HVU_QA_runtime/` by itself.
- `HVU_QA_tool.bat`: quick-run file for Windows.
- `HVU_QA_end_to_end_guide.ipynb`: notebook split into the two flows, `Full project` and `Quick tool`.

## 3. Environment requirements

Recommended:

- Python `3.11`
- Windows PowerShell
- `pip`

To run on an NVIDIA GPU, install the `torch` build that matches your machine's CUDA version.

## 4. Usage A - Full project

This is the right option if you want to use and keep developing the entire codebase.

Create a virtual environment:

```powershell
python -m venv venv
.\venv\Scripts\Activate.ps1
```

Install dependencies:

```powershell
.\venv\Scripts\python -m pip install --upgrade pip
.\venv\Scripts\python -m pip install -r requirements.txt
```

If you use an NVIDIA GPU, install the `torch` build matching your CUDA version first, then install the rest of `requirements.txt`.

Sync the model:

```powershell
.\venv\Scripts\python HVU_QA_tool.py --skip-run
```

Run the app:

```powershell
.\venv\Scripts\python main.py
```

By default the application runs at `http://127.0.0.1:5000`.

## 5. Usage B - Standalone tool with a single file

This option suits **end users** who only want to run the question-generation model without downloading the full source code.

You only need:

- `HVU_QA_tool.py`
- or `HVU_QA_tool.py` + `HVU_QA_tool.bat`

Put those files in an empty folder, then run:

```powershell
python HVU_QA_tool.py
```

Or on Windows:

```text
double-click HVU_QA_tool.bat
```

The launcher automatically:

1. checks whether the current folder contains the full project
2. if the full project is absent, creates `HVU_QA_runtime/`
3. creates `.hvu_qa_tool_venv/` if it is not already running inside a virtualenv
4. installs any missing runtime dependencies
5. downloads the model from Hugging Face
6. runs the web app inside the freshly built runtime

Default model source:

- Dataset repo: `DANGDOCAO/GeneratingQuestions`
- Revision: `main`
- Model folder in the repo: `HVU_QA/t5-viet-qg-finetuned/`

The launcher skips training checkpoints to avoid downloading data end users do not need. While syncing the model, it shows:

- per-file progress, e.g. `[3/14] Đang tải ...`
- an overall percentage progress bar

Notes:

- the first run requires an Internet connection
- in standalone mode the launcher does **not** need `main.py`, `backend/`, `frontend/`, or `requirements.txt` next to it
- `--best-model-only` only works when the Hugging Face repo actually contains a `best-model` folder

Examples:

```powershell
python HVU_QA_tool.py --device cpu
python HVU_QA_tool.py --host 127.0.0.1 --port 5000
python HVU_QA_tool.py --force-download
python HVU_QA_tool.py --skip-run
python HVU_QA_tool.py --no-browser
python HVU_QA_tool.py --prepare-runtime-only
python HVU_QA_tool.py --force-standalone-runtime
python HVU_QA_tool.py --runtime-dir MyRuntime
python HVU_QA_tool.py --no-venv
```

Main options:

- `--repo-id`: switch to another Hugging Face repo if needed
- `--revision`: pick a branch, tag, or commit hash
- `--device auto|cpu|cuda`: force the device the model runs on
- `--force-download`: re-download the model and overwrite local files
- `--skip-download`: skip downloading the model from Hugging Face
- `--skip-install`: do not auto-install missing dependencies
- `--skip-run`: only prepare the model and environment, do not open the app
- `--prepare-runtime-only`: only build the runtime; no model download, no app run
- `--force-standalone-runtime`: force the standalone runtime even when inside the full project
- `--force-runtime-refresh`: overwrite the runtime files embedded in the launcher
- `--runtime-dir`: change the standalone runtime directory
- `--no-venv`: do not create a dedicated virtualenv for the launcher

## 6. Guide notebook

Open the file:

```text
HVU_QA_end_to_end_guide.ipynb
```

The notebook is now clearly split into:

- `Phần A - Full project` (Part A - full project)
- `Phần B - Chạy nhanh bằng tool` (Part B - quick run via the tool)

The `Quick tool` part of the notebook simulates exactly the case where only `HVU_QA_tool.py` and `HVU_QA_tool.bat` are available.

## 7. Useful environment variables

You can configure these before running:

```powershell
$env:HVU_HOST = "127.0.0.1"
$env:HVU_PORT = "5000"
$env:HVU_DEBUG = "0"
$env:HVU_OPEN_BROWSER = "1"
$env:HVU_MODEL_DIR = "t5-viet-qg-finetuned"
$env:HVU_TASK_PREFIX = "sinh câu hỏi"
$env:HVU_DEVICE = "auto"            # auto | cpu | cuda
$env:HVU_CPU_THREADS = "8"
$env:HVU_GPU_DTYPE = "auto"         # auto | float16 | bfloat16 | float32
$env:HVU_MAX_SOURCE_LENGTH = "512"
$env:HVU_MAX_NEW_TOKENS = "64"
.\venv\Scripts\python main.py
```

Quick reference:

- `HVU_DEVICE=auto`: picks `cuda` when a GPU is available, otherwise `cpu`
- `HVU_OPEN_BROWSER=0`: do not auto-open the browser
- `HVU_MODEL_DIR`: change the backend's default model directory
- `HVU_TASK_PREFIX`: change the prompt prefix fed to the model
- `HVU_CPU_THREADS`: cap the number of threads when running on CPU
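The `HVU_DEVICE=auto` rule above can be sketched as a small pure function (the name `resolve_device` is hypothetical; the actual backend may resolve devices differently):

```python
import os

def resolve_device(requested: str, cuda_available: bool) -> str:
    # "auto" picks cuda when a GPU is available, otherwise cpu,
    # as described in the quick reference above; explicit values
    # ("cpu", "cuda") pass through unchanged.
    if requested == "auto":
        return "cuda" if cuda_available else "cpu"
    return requested

# Read the variable with its documented default of "auto".
requested = os.getenv("HVU_DEVICE", "auto")
```

Forcing `HVU_DEVICE=cpu` is useful when a CUDA build of torch is installed but the GPU is busy or out of memory.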
## 8. Using the full-project UI

After opening the web page:

1. Type or paste a text passage into the input box.
2. Set `Số câu hỏi` (number of questions) to `1` or more.
3. If needed, pick a model in the sidebar.
4. Click `Sinh câu hỏi` (generate questions).

The full-project UI additionally offers:

- `Lịch sử` (history): stores recent generation runs in `localStorage`
- `Ví dụ mẫu` (samples): quickly inserts a sample legal text to try
- `Trạng thái model` (model status): shows the active model and the processing device
- `Tác giả` (authors): information about the team

## 9. Voice input

The mic in the full-project UI uses the browser's `Web Speech API`; it does not record audio files locally.

Notes:

- use Chrome or Edge for best results
- works reliably only on `https://...` or `http://localhost`
- the browser must be granted microphone permission

## 10. Supported models

The full-project backend auto-detects available models inside the project folder.

The current logic supports:

- a root model folder containing `config.json`
- a model nested inside `best-model/`
- a model nested inside `final-model/`

With the current project, the dropdown typically shows:

- `t5-viet-qg-finetuned`
- `best-model`
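A minimal sketch of the discovery rule described above (the helper name is hypothetical; the real logic in `backend/app.py` may differ in details such as ordering or labels):

```python
import os

def discover_models(root: str) -> list[str]:
    """List model folders under root: a folder qualifies when it has a
    config.json at its top level, or nested in best-model/ or final-model/."""
    found = []
    for name in sorted(os.listdir(root)):
        folder = os.path.join(root, name)
        if not os.path.isdir(folder):
            continue
        if os.path.isfile(os.path.join(folder, "config.json")):
            found.append(name)
        for nested in ("best-model", "final-model"):
            if os.path.isfile(os.path.join(folder, nested, "config.json")):
                found.append(f"{name}/{nested}")
    return found
```

This also explains the troubleshooting advice in section 14: a model folder without `config.json` is simply invisible to the dropdown.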
## 11. Running from the command line

You can generate questions directly without opening the web UI:

```powershell
.\venv\Scripts\python generate_question.py --text "Cơ sở giáo dục đại học có nhiệm vụ đào tạo, nghiên cứu khoa học và phục vụ cộng đồng." --num_questions 5 --output_format text
```

Read input from a file:

```powershell
.\venv\Scripts\python generate_question.py --input_file input.txt --num_questions 5 --output_format json
```

Commonly used parameters:

- `--model_dir`
- `--task_prefix`
- `--num_questions`
- `--device auto|cpu|cuda`
- `--cpu_threads`
- `--gpu_dtype auto|float16|bfloat16|float32`
- `--max_source_length`
- `--max_new_tokens`

## 12. Backend API

### `GET /api/info`

Returns:

- the system title
- the currently selected model
- the list of available models
- the device status and whether the model has been loaded

### `POST /api/model`

Switches the active model.

Sample body:

```json
{
  "model_id": "t5-viet-qg-finetuned/best-model"
}
```

### `POST /api/generate`

Generates questions from text.

Sample body:

```json
{
  "model_id": "t5-viet-qg-finetuned/best-model",
  "text": "Cơ sở giáo dục đại học có nhiệm vụ tổ chức đào tạo, nghiên cứu khoa học và phục vụ cộng đồng.",
  "num_questions": 5
}
```

A successful response contains:

- `ok`
- `text`
- `num_questions`
- `questions`
- `formatted`
- `elapsed_ms`
- `model_name`
- `selected_model_id`
- `meta`
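The request and response shapes above can be exercised offline before wiring up a client (helper names are hypothetical; field names are taken verbatim from the lists above, and the actual HTTP call still needs a running server):

```python
import json

# Fields of a successful /api/generate response, per the list above.
GENERATE_FIELDS = {
    "ok", "text", "num_questions", "questions", "formatted",
    "elapsed_ms", "model_name", "selected_model_id", "meta",
}

def build_generate_request(model_id: str, text: str, num_questions: int) -> str:
    """Serialize a /api/generate request body; ensure_ascii=False keeps
    Vietnamese text readable in the JSON."""
    return json.dumps(
        {"model_id": model_id, "text": text, "num_questions": num_questions},
        ensure_ascii=False,
    )

def looks_like_generate_response(payload: dict) -> bool:
    """Check that a parsed response carries all documented success fields."""
    return GENERATE_FIELDS.issubset(payload)
```

A client would POST the serialized body with `Content-Type: application/json` and run the response through the shape check before touching `questions`.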
| 328 |
+
## 13. Fine-tune model
|
| 329 |
+
|
| 330 |
+
Nếu muốn huấn luyện lại model, dùng `fine_tune_qg.py`.
|
| 331 |
+
|
| 332 |
+
Ví dụ:
|
| 333 |
+
|
| 334 |
+
```powershell
|
| 335 |
+
.\venv\Scripts\python fine_tune_qg.py --device cpu --output_dir t5-viet-qg-finetuned-cpu
|
| 336 |
+
```
|
| 337 |
+
|
| 338 |
+
Hoặc với GPU:
|
| 339 |
+
|
| 340 |
+
```powershell
|
| 341 |
+
.\venv\Scripts\python fine_tune_qg.py --device cuda --fp16 --gradient_checkpointing --output_dir t5-viet-qg-finetuned
|
| 342 |
+
```
|
| 343 |
+
|
| 344 |
+
Một số tham số nên biết thêm khi fine-tune:
|
| 345 |
+
|
| 346 |
+
- `--model_name`
|
| 347 |
+
- `--validation_file`
|
| 348 |
+
- `--resume_from_latest` hoặc `--resume_checkpoint`
|
| 349 |
+
- `--min_free_gpu_mb`
|
| 350 |
+
- `--skip_gpu_preflight`
|
| 351 |
+
- `--use_first_answer_only`
|
| 352 |
+
- `--require_answer_in_context`
|
| 353 |
+
|
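`--resume_from_latest` presumably selects the newest Hugging Face-style `checkpoint-<step>` folder in the output directory; a hedged sketch of that selection (the helper is illustrative only, and the real `fine_tune_qg.py` may implement it differently):

```python
import os
import re

def latest_checkpoint(output_dir: str):
    """Return the path of the checkpoint-<step> subfolder with the
    highest step number, or None when no checkpoint exists yet."""
    pattern = re.compile(r"^checkpoint-(\d+)$")
    best_step, best_name = -1, None
    for name in os.listdir(output_dir):
        m = pattern.match(name)
        if m and os.path.isdir(os.path.join(output_dir, name)):
            step = int(m.group(1))
            if step > best_step:
                best_step, best_name = step, name
    return os.path.join(output_dir, best_name) if best_name else None
```

Numeric comparison matters here: a lexicographic sort would rank `checkpoint-900` above `checkpoint-1000`.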
## 14. Common problems

Missing Python libraries:

- reactivate the `venv`
- rerun `python -m pip install -r requirements.txt`

Cannot activate the `venv` in PowerShell:

- run `Set-ExecutionPolicy -Scope Process Bypass`
- then rerun `.\venv\Scripts\Activate.ps1`

Cannot download the model from Hugging Face:

- check your Internet connection
- check that the `DANGDOCAO/GeneratingQuestions` repo is still public
- retry with `--force-download`

No model shows up in the dropdown:

- check that `t5-viet-qg-finetuned/` contains `config.json`
- recheck the `best-model/` or `final-model/` structure
- restart the server after adding a new model

Mic does not work:

- use Chrome or Edge
- grant microphone permission
- serve via `localhost` or `https`

Question generation is slow:

- the first run can be slow while the model loads
- without a GPU, the app runs on CPU

## 15. Notes

- Full-project history is stored in the browser; no database is used.
- The standalone tool uses a minimal runtime for quick end-user startup; it does not replace the full development source code.
HVU_QA/requirements.txt
ADDED
@@ -0,0 +1,11 @@
# Runtime + training dependencies for HVU_QA.
# If you use an NVIDIA GPU, install the torch build matching your machine's CUDA, per PyTorch's official instructions.
accelerate>=1.1.0,<2.0.0
datasets>=2.19.0,<4.0.0
Flask>=3.0.0,<4.0.0
huggingface_hub>=0.23.0,<1.0.0
numpy>=1.26.0,<3.0.0
safetensors>=0.4.3,<1.0.0
sentencepiece>=0.2.0,<1.0.0
torch>=2.2.0,<3.0.0
transformers>=4.41.0,<5.0.0