
The Best LM Studio Models for Translation and Transcription (2025)

October 25, 2025 - 5 min read - Raymond

Tags: LLM, Artificial Intelligence, Translation, Tools, Android, Local LLM

LM Studio has fundamentally changed the game for AI enthusiasts, allowing us to run powerful, open-weight large language models (LLMs) on our own hardware. While general-purpose chat is its most common use, many users overlook its incredible potential for specialized tasks like language translation and audio transcription.

Running these models locally means your data—be it a private document for translation or a sensitive meeting for transcription—never leaves your computer. But which models should you download?

Here’s my breakdown of the best models available in LM Studio for these two critical tasks, and how you can take it a step further by accessing them from anywhere in your house using your Android phone.


🎧 Best Models for Audio Transcription

This is perhaps the most straightforward category. When it comes to open-source, high-performance audio transcription, one name reigns supreme: OpenAI's Whisper.

Whisper is a highly specialized model trained exclusively on a massive dataset of audio and corresponding text. It's not a conversational LLM; it's a dedicated tool for turning speech into text, and it does so with near-human-level accuracy.

The best part? The community has created GGUF versions of Whisper, making them perfectly compatible with LM Studio.

How to Find Whisper in LM Studio:

  1. Go to the Search tab (magnifying glass icon) in LM Studio.

  2. In the search bar, type whisper GGUF.

  3. You will see several options. I recommend looking for the models from reputable creators like ggerganov (the original creator of llama.cpp) or TheBloke.

  • whisper-large-v3-GGUF: This is the most powerful and accurate model. It has exceptional multilingual capabilities and can handle heavy accents and noisy backgrounds with impressive resilience. This is my top recommendation if your hardware can handle it.

  • whisper-medium-GGUF: If you find the large model is too slow or consumes too much VRAM, the medium model is the perfect sweet spot. It offers a fantastic balance of speed and accuracy and is more than capable for most transcription needs.

Note: To use Whisper models, you'll load them on the Server tab (<-> icon) in LM Studio. They are compatible with the OpenAI API endpoint for audio transcription, which is how most client applications will interact with them.
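To make that concrete, here is a hedged sketch of posting an audio file to the server's OpenAI-style `/v1/audio/transcriptions` endpoint using only Python's standard library. The port (1234 is LM Studio's default) and the model name are assumptions; match both to what your server page shows.

```python
# Minimal sketch: send audio to a local LM Studio server for transcription.
# Assumptions: server at localhost:1234, a Whisper model loaded under the
# name "whisper-large-v3". Adjust both to your own setup.
import json
import urllib.request
import uuid

def multipart_body(audio: bytes, filename: str, model: str,
                   boundary: str) -> bytes:
    """Encode the model field and the audio file as multipart/form-data,
    the format the OpenAI-compatible transcription endpoint expects."""
    model_part = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="model"\r\n\r\n{model}\r\n'
    ).encode()
    file_header = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="file"; filename="{filename}"\r\n'
        "Content-Type: application/octet-stream\r\n\r\n"
    ).encode()
    closing = f"\r\n--{boundary}--\r\n".encode()
    return model_part + file_header + audio + closing

def transcribe(path: str, model: str = "whisper-large-v3",
               server: str = "http://localhost:1234/v1") -> str:
    boundary = uuid.uuid4().hex  # unique separator for the form parts
    with open(path, "rb") as f:
        body = multipart_body(f.read(), path, model, boundary)
    req = urllib.request.Request(
        f"{server}/audio/transcriptions", data=body,
        headers={"Content-Type": f"multipart/form-data; boundary={boundary}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["text"]
```

Any client that speaks the OpenAI audio API (including the official `openai` Python package pointed at your local base URL) can do the same thing; the sketch above just shows what's on the wire.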


🌍 Best Models for Language Translation

Unlike transcription, there isn't one single "best" model for translation. Instead, you have a choice between large, highly capable general-purpose models and smaller, specialized ones.

1. The All-Rounder (High Performance): Qwen 1.5 & DeepSeek

The latest top-tier multilingual models are phenomenal at translation. They understand context, nuance, and idioms in a way that older, dedicated translation models can't.

  • Qwen1.5-72B-Chat-GGUF: The Qwen family of models is renowned for its powerful multilingual understanding. The 72-billion parameter model is a heavyweight champion that can provide incredibly nuanced and accurate translations. It's especially strong with Chinese and English but performs exceptionally well across many language pairs. Be warned: it requires significant VRAM (48GB+).

  • DeepSeek-LLM-67B-Chat-GGUF: Another excellent high-performance option, DeepSeek's model is a strong competitor to Qwen. It excels at reasoning and following complex instructions, which translates to high-quality, context-aware translations.

2. The Balanced Choice (Efficiency & Quality): Mistral & Llama 3

For most users, a massive 70B+ model is overkill. These smaller, instruct-tuned models offer fantastic quality and will run on a much wider range of hardware.

  • Mistral-7B-Instruct-v0.3-GGUF: This is my go-to recommendation for almost everyone. The Mistral 7B model is famously lightweight (a 4-bit quantization runs in roughly 8GB of RAM/VRAM) and provides performance that punches far above its weight class. Its instruction-following is top-notch, making it easy to command: "Translate the following text from English to French:"

  • Meta-Llama-3-8B-Instruct-GGUF: The other king of the "small" models. Llama 3 has excellent multilingual capabilities and is a very well-rounded and balanced choice for translation and general chat.
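Driving one of these instruct models for translation is just a chat request against LM Studio's OpenAI-compatible `/v1/chat/completions` endpoint. The sketch below shows one way to phrase the prompt; the server address and model identifier are assumptions to swap for your own.

```python
# Minimal sketch: translation via a local instruct model in LM Studio.
# Assumptions: server at localhost:1234, a model loaded under the name
# "mistral-7b-instruct-v0.3". Use whatever identifier your server lists.
import json
import urllib.request

def translation_messages(text: str, source: str, target: str) -> list:
    """Build a chat prompt that asks for the translation and nothing else."""
    return [
        {"role": "system",
         "content": f"You are a translator. Translate the user's text from "
                    f"{source} to {target}. Reply with the translation only."},
        {"role": "user", "content": text},
    ]

def translate(text: str, source: str = "English", target: str = "French",
              server: str = "http://localhost:1234/v1",
              model: str = "mistral-7b-instruct-v0.3") -> str:
    payload = json.dumps({
        "model": model,
        "messages": translation_messages(text, source, target),
        "temperature": 0.2,  # low temperature keeps the output literal
    }).encode()
    req = urllib.request.Request(
        f"{server}/chat/completions", data=payload,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

A firm system prompt ("reply with the translation only") matters more than you'd expect: chatty instruct models otherwise like to wrap the translation in commentary.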

3. The Specialist: ALMA

  • ALMA-13B-GGUF: If your only goal is translation, keep an eye on the ALMA models. ALMA (Advanced Language Model-based trAnslator) is a family of models fine-tuned specifically on translation tasks. While the general-purpose models above are often "good enough," a specialized model like ALMA can sometimes provide more precise and literal translations.

📱 How to Use These Models on Your Android Phone

Running these models on your PC is great, but what about when you're on the couch or in another room? You don't want to be chained to your desk.

This is where LMSA (LM Studio Android) comes in. It's an open-source app that transforms your Android phone into a slick, native client for your LM Studio server.

LMSA connects to your PC over your local Wi-Fi, allowing you to chat with any model you have loaded in LM Studio. You get the full power of your desktop's GPU, all from the convenience of your phone.

How to Get Started with LMSA:

First, make sure you've started the server in your desktop LM Studio app (go to the <-> tab, load a model, and click "Start Server"). Don't forget to enable "Serve over local network" and "Enable CORS" in the server settings!

You then have two easy options to get the app:

  1. Recommended (Google Play Store): The easiest and safest method. This ensures you get automatic updates and a stable, verified build.

  2. Manual (For Power Users): If you prefer to host it yourself or want to try the latest unreleased features, you can clone the project from GitHub and run it as a simple web server.

Once the app is running, just plug in your PC's local IP address and port (shown on the LM Studio server page), and you'll be chatting with your local translation or chat model in seconds.
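Before opening the app, it can save some head-scratching to confirm the server is actually reachable from another machine on your LAN. This small sketch queries the OpenAI-style `/v1/models` endpoint; the IP address shown is an example, so substitute the one LM Studio displays on its server page.

```python
# Sanity check: can another device on the LAN see the LM Studio server?
# Assumption: default port 1234; replace the host with your PC's LAN IP
# as shown on LM Studio's server page.
import json
import urllib.request

def models_url(host: str, port: int = 1234) -> str:
    """LM Studio exposes loaded models at the OpenAI-style /v1/models path."""
    return f"http://{host}:{port}/v1/models"

def list_models(host: str, port: int = 1234) -> list:
    """Return the ids of all models the server currently exposes."""
    with urllib.request.urlopen(models_url(host, port), timeout=5) as resp:
        return [m["id"] for m in json.load(resp)["data"]]
```

If `list_models("192.168.x.x")` times out from your phone's network but works from the PC itself, the usual culprits are a forgotten "Serve over local network" toggle or a firewall rule blocking the port.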

I use LMSA every day. It's the perfect companion to LM Studio and truly untethers the power of local LLMs.

Happy translating and transcribing!

-Ray