Hunyuan-MT-7B
About Hunyuan-MT-7B
The Hunyuan Translation Model family consists of a translation model, Hunyuan-MT-7B, and an ensemble model, Hunyuan-MT-Chimera. Hunyuan-MT-7B is a lightweight 7-billion-parameter model that translates source text into a target language. It supports mutual translation among 33 languages, including five ethnic minority languages in China. In the WMT25 machine translation competition, Hunyuan-MT-7B won first place in 30 of the 31 language categories it entered, demonstrating strong translation capability. For translation tasks, Tencent Hunyuan proposed a comprehensive training framework covering pre-training, supervised fine-tuning, translation enhancement, and ensemble refinement, achieving state-of-the-art performance among models of similar scale. The model is computationally efficient and easy to deploy, making it suitable for a wide range of application scenarios.
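The model is served through a serverless API. Below is a minimal sketch of a translation call, assuming an OpenAI-compatible chat completions endpoint; the base_url and the model identifier are placeholders, not values confirmed by this page.

```python
# A minimal sketch, assuming SiliconFlow exposes an OpenAI-compatible
# chat completions endpoint. The base_url and model id are assumptions.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_API_KEY",                     # your SiliconFlow API key
    base_url="https://api.siliconflow.com/v1",  # assumed endpoint
)

response = client.chat.completions.create(
    model="tencent/Hunyuan-MT-7B",  # assumed model identifier
    messages=[
        {
            "role": "user",
            # A plain instruction-style prompt: name the target language
            # and supply the source text to translate.
            "content": "Translate the following text into French:\n\n"
                       "The weather is lovely today.",
        }
    ],
    max_tokens=512,  # must fit within the model's 33K context window
)

print(response.choices[0].message.content)
```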
Available Serverless
Run queries immediately, pay only for usage.
Price: $0.0 / $0.0 per 1M tokens (input/output)
Metadata

Specification
State: Available

Architecture
Calibrated: No
Mixture of Experts: No
Total Parameters: 7B
Activated Parameters: 7B
Reasoning: No
Precision: FP8
Context Length: 33K
Max Tokens: 33K
Supported Functionality
Serverless: Supported
Serverless LoRA: Not supported
Fine-tuning: Not supported
Embeddings: Not supported
Rerankers: Not supported
Image Input: Not supported
JSON Mode: Supported (see the example after this list)
Structured Outputs: Not supported
Tools: Not supported
FIM Completion: Not supported
Chat Prefix Completion: Supported
SiliconFlow Service
Comprehensive solutions to deploy and scale your AI applications with maximum flexibility: 60% lower latency, 2x higher throughput, and 65% cost savings.

