Hunyuan-MT-7B Now On SiliconFlow: High-quality Translation Model Supporting 33 Languages
Sep 25, 2025
TL;DR: Hunyuan-MT-7B is here on SiliconFlow — Tencent's 7B-parameter model that redefines machine translation. Best-in-class at WMT25 (1st in 30 of 31 tasks), Hunyuan-MT-7B surpasses Google Translate and other top translation-specialized models, covering 33 languages from Chinese, English, and Japanese to Czech, Marathi, Estonian, and Icelandic. Try it now through our plug-and-play API — premium quality, free to use!
SiliconFlow is thrilled to bring Hunyuan-MT-7B to our model catalog — Tencent's first open-source multilingual translation model, supporting bidirectional translation across 33 major languages. Despite its 7B-parameter size, Hunyuan-MT-7B delivers impressive performance, significantly outperforming many translation-specialized models and even state-of-the-art large models.
With SiliconFlow's Hunyuan-MT-7B API, you can expect:
Free to Use: Hunyuan-MT-7B is now available on SiliconFlow with no cost.
Comprehensive Language Support: Bidirectional translation across 33 languages.
Whether you're localizing digital content, enabling cross-border communication, or powering multilingual enterprise solutions, SiliconFlow's API brings Hunyuan-MT-7B into your workflow with dependable quality.
Why It's a Game Changer in Machine Translation
At ACL WMT25, Hunyuan-MT-7B achieved a remarkable result — first place in 30 of 31 language tracks, covering both high-resource languages such as Chinese, English, and Japanese, and lower-resource ones like Czech, Marathi, Estonian, and Icelandic.
What makes this achievement remarkable is that WMT25 imposes strict constraints: models must be open-source and trained only on publicly available data. Despite these limitations, Hunyuan-MT-7B consistently outperformed much larger models.

The FLORES-200 benchmark results confirm this advantage, with Hunyuan-MT-7B surpassing peer 7B–9B models and even rivaling larger-parameter models:
English⇔Multilingual: Achieves 91.1% (EN→XX) and 90.2% (XX→EN), ahead of translation-specialized models like Tower-Plus-9B (78.8% / 87.0%) and Google Translate (76.4% / 77.6%), highlighting idiomatic fluency in English tasks.
Chinese⇔Multilingual: Scores 87.6% (ZH→XX) and 85.3% (XX→ZH), showing robust performance in both directions.
WMT24pp Leadership: With 85.8%, it exceeds Llama-4-Scout-17B-16E-Instruct (75.5%), proving a 7B model can rival much larger models.
Chinese Minority Language Support: Scores 60.8% on Minority⇔Mandarin translation tasks, clearly ahead of other models, underscoring leadership in minority and low-resource translation.
*Note: “XX” denotes other languages in the FLORES-200 dataset. Translation pairs are grouped as English⇔XX, Chinese⇔XX, and XX⇔XX.

The Training Paradigm Behind Its Success
This comprehensive leadership is the result of Tencent Hunyuan's training framework, designed specifically for multilingual translation:
Pretraining → CPT (Continual Pre-training) → Supervised Fine-Tuning → Translation RL → Integrated RL

This progressive process refines the model's general knowledge, adapts it to translation tasks, and aligns outputs through reinforcement learning.
Real Performance on SiliconFlow Playground
In these demos, you can see Hunyuan-MT-7B running on SiliconFlow translating seamlessly between English and Spanish. The responses are fast and the translations are natural and accurate.

Hunyuan-MT-7B can even handle English slang with ease, accurately translating “OG” as “más experimentado” (“the most experienced”).

Beyond English–Spanish, you can also explore other languages supported by Hunyuan-MT-7B in the chart below, making it well-suited for global communication, cross-border business, and multilingual applications.

Get Started Immediately
1. Explore: Try Hunyuan-MT-7B in the SiliconFlow playground.
2. Integrate: Use our OpenAI-compatible API. Explore the full API specifications in the SiliconFlow API documentation.
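Because the API is OpenAI-compatible, integration is a single HTTP call to the chat completions endpoint. The sketch below uses only the Python standard library; note that the base URL and the model ID (`tencent/Hunyuan-MT-7B`) are assumptions for illustration — confirm both against the SiliconFlow API documentation and model catalog.

```python
import json
import urllib.request

# Assumed endpoint and model ID — verify these against the SiliconFlow docs.
API_URL = "https://api.siliconflow.com/v1/chat/completions"
MODEL_ID = "tencent/Hunyuan-MT-7B"


def build_request(text: str, target_language: str, api_key: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request asking for a translation."""
    payload = {
        "model": MODEL_ID,
        "messages": [
            {
                "role": "user",
                "content": f"Translate the following text into {target_language}:\n\n{text}",
            }
        ],
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )


def translate(text: str, target_language: str, api_key: str) -> str:
    """Send the request and return the model's translation."""
    with urllib.request.urlopen(build_request(text, target_language, api_key)) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

With a valid SiliconFlow API key, `translate("The OG of the team retired last year.", "Spanish", api_key="YOUR_API_KEY")` returns the Spanish translation as plain text. The same request works with any OpenAI-compatible client library by pointing its base URL at SiliconFlow.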
Start building with Hunyuan-MT-7B on SiliconFlow today — powerful, accurate, and free to use!
Join our Discord community now →