# LM Studio Migration Notes

## Summary

This repo originally translated subtitle chunks through a Google Translate scraper wired directly into `src/engines.py`. The translation backend has been replaced with a dedicated LM Studio client that talks to an OpenAI-compatible `/v1/chat/completions` endpoint.

## New Runtime Defaults

- `LM_STUDIO_BASE_URL=http://127.0.0.1:1234/v1`
- `LM_STUDIO_API_KEY=lm-studio`
- `LM_STUDIO_MODEL=gemma-3-4b-it`
- `--translation-backend lmstudio`

## Commands Used In This Checkout

```powershell
uv venv --clear --python "C:\pinokio\bin\miniconda\python.exe" .venv
uv pip install --python .venv\Scripts\python.exe -r requirements.txt
pytest
```

Validation commands:

```powershell
.venv\Scripts\python.exe -m pytest
.venv\Scripts\python.exe main.py --help
.venv\Scripts\python.exe -c "from src.translation import TranslationConfig, LMStudioTranslator; print(TranslationConfig.from_env().model)"
```

## Files Touched

- `main.py`
- `requirements.txt`
- `README.md`
- `src/engines.py`
- `src/translation.py`
- `tests/conftest.py`
- `tests/test_main_cli.py`
- `tests/test_translation.py`

## Notes

- Translation remains segment-by-segment to keep subtitle ordering deterministic.
- The CLI now supports `--lmstudio-base-url` and `--lmstudio-model`.
- The argument parser and help text now load before heavy runtime imports, which makes `main.py --help` more reliable.
- `src/googlev4.py` was removed from the active codebase, since LM Studio is now the only supported translation backend.
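For reference, the runtime defaults above can be combined into a request against the OpenAI-compatible endpoint. The sketch below only builds the URL, headers, and JSON body (it does not send anything); `build_chat_request` and the prompt wording are illustrative and not the repo's actual `LMStudioTranslator` API.

```python
import json
import os


def build_chat_request(text: str, target_lang: str = "en") -> tuple[str, dict, bytes]:
    """Build URL, headers, and JSON body for an LM Studio /v1/chat/completions call.

    Falls back to the defaults documented in the migration notes when the
    LM_STUDIO_* environment variables are unset.
    """
    base_url = os.environ.get("LM_STUDIO_BASE_URL", "http://127.0.0.1:1234/v1")
    api_key = os.environ.get("LM_STUDIO_API_KEY", "lm-studio")
    model = os.environ.get("LM_STUDIO_MODEL", "gemma-3-4b-it")

    url = f"{base_url}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",  # LM Studio accepts any key by default
        "Content-Type": "application/json",
    }
    body = json.dumps(
        {
            "model": model,
            "messages": [
                {
                    "role": "system",
                    "content": f"Translate the subtitle text to {target_lang}. "
                               "Reply with the translation only.",
                },
                {"role": "user", "content": text},
            ],
            "temperature": 0.2,  # low temperature for stable subtitle output
        }
    ).encode("utf-8")
    return url, headers, body
```

The returned pieces can be passed to any HTTP client (e.g. `urllib.request` or `requests`); LM Studio's response follows the standard chat-completions shape, with the translation in `choices[0].message.content`.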
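The segment-by-segment note above amounts to: each cue is translated in its own request, and results are collected in input order, so cue *i* always maps to translation *i* regardless of backend behavior. A minimal sketch (function name hypothetical, not the repo's code):

```python
from typing import Callable


def translate_segments(segments: list[str], translate: Callable[[str], str]) -> list[str]:
    """Translate each subtitle segment independently, preserving input order.

    One call per segment is slower than batching, but the output list index
    always matches the source cue index, which keeps subtitle ordering
    deterministic.
    """
    return [translate(segment) for segment in segments]
```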
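The new CLI flags could be wired with `argparse` roughly as follows. The flag names and defaults come from the notes above; the parser structure itself is a sketch, not the actual `main.py` implementation:

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    """Sketch of the translation-backend CLI flags described in the notes."""
    parser = argparse.ArgumentParser(prog="main.py")
    parser.add_argument(
        "--translation-backend",
        default="lmstudio",
        choices=["lmstudio"],  # LM Studio is the only supported backend now
    )
    parser.add_argument("--lmstudio-base-url", default="http://127.0.0.1:1234/v1")
    parser.add_argument("--lmstudio-model", default="gemma-3-4b-it")
    return parser
```

Defining the parser at module level, before the heavy translation imports, is what lets `main.py --help` return quickly.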