# LM Studio Migration Notes

## Summary
This repo originally translated subtitle chunks through a Google Translate scraper wired directly into `src/engines.py`. That backend has been replaced with a dedicated LM Studio client that talks to an OpenAI-compatible `/v1/chat/completions` endpoint.
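For orientation, here is a minimal sketch of what a segment-level request to such an endpoint looks like, using only the standard library. The function names, prompt wording, and defaults below are illustrative, not the repo's actual API in `src/translation.py`:

```python
import json
import urllib.request

# Illustrative defaults; the real client reads these from the environment.
BASE_URL = "http://127.0.0.1:1234/v1"
MODEL = "gemma-3-4b-it"


def build_payload(text: str, target_lang: str) -> dict:
    """Build an OpenAI-compatible chat-completions payload for one segment."""
    return {
        "model": MODEL,
        "messages": [
            {
                "role": "system",
                "content": (
                    f"Translate the user's subtitle segment into {target_lang}. "
                    "Reply with the translation only."
                ),
            },
            {"role": "user", "content": text},
        ],
        "temperature": 0.0,  # deterministic output helps subtitle consistency
    }


def translate(text: str, target_lang: str, api_key: str = "lm-studio") -> str:
    """POST one segment to the LM Studio chat-completions endpoint."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_payload(text, target_lang)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"].strip()
```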
## New Runtime Defaults
```text
LM_STUDIO_BASE_URL=http://127.0.0.1:1234/v1
LM_STUDIO_API_KEY=lm-studio
LM_STUDIO_MODEL=gemma-3-4b-it
```

The CLI defaults to `--translation-backend lmstudio`.
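A sketch of how defaults like these are typically resolved from the environment. This is a hypothetical mirror of what `TranslationConfig.from_env` might do; the actual field names in `src/translation.py` may differ:

```python
import os
from dataclasses import dataclass


@dataclass(frozen=True)
class TranslationConfig:
    """Illustrative config object; not the repo's actual class definition."""

    base_url: str
    api_key: str
    model: str

    @classmethod
    def from_env(cls) -> "TranslationConfig":
        # Fall back to the documented runtime defaults when a variable is unset.
        return cls(
            base_url=os.environ.get("LM_STUDIO_BASE_URL", "http://127.0.0.1:1234/v1"),
            api_key=os.environ.get("LM_STUDIO_API_KEY", "lm-studio"),
            model=os.environ.get("LM_STUDIO_MODEL", "gemma-3-4b-it"),
        )
```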
## Commands Used In This Checkout
```shell
uv venv --clear --python "C:\pinokio\bin\miniconda\python.exe" .venv
uv pip install --python .venv\Scripts\python.exe -r requirements.txt pytest
```
Validation commands:

```shell
.venv\Scripts\python.exe -m pytest
.venv\Scripts\python.exe main.py --help
.venv\Scripts\python.exe -c "from src.translation import TranslationConfig, LMStudioTranslator; print(TranslationConfig.from_env().model)"
```
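The pytest run above cannot depend on a live LM Studio server. One common way such tests stay hermetic is to inject the HTTP transport and replace it with a stub; the self-contained sketch below shows that pattern, with names that are illustrative rather than the repo's actual test code:

```python
def translate_segment(text: str, post) -> str:
    """Translate one segment via an injected `post(payload) -> response dict`."""
    payload = {
        "model": "gemma-3-4b-it",
        "messages": [{"role": "user", "content": text}],
    }
    response = post(payload)
    return response["choices"][0]["message"]["content"]


def fake_post(payload: dict) -> dict:
    """Stub transport: echo the segment back shaped like a chat-completions reply."""
    text = payload["messages"][-1]["content"]
    return {"choices": [{"message": {"content": f"[de] {text}"}}]}
```

A test can then assert on `translate_segment("Hello", fake_post)` without any network access.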
## Files Touched
- `main.py`
- `requirements.txt`
- `README.md`
- `src/engines.py`
- `src/translation.py`
- `tests/conftest.py`
- `tests/test_main_cli.py`
- `tests/test_translation.py`
## Notes
- Translation remains segment-by-segment for deterministic subtitle ordering.
- The CLI now supports `--lmstudio-base-url` and `--lmstudio-model`.
- The parser/help now loads before heavy runtime imports, which makes `main.py --help` more reliable.
- `src/googlev4.py` was removed from the active codebase because LM Studio is now the only supported translation backend.