Hunyuan-MT-7B

tencent/Hunyuan-MT-7B

About Hunyuan-MT-7B

The Hunyuan Translation Model family consists of a translation model, Hunyuan-MT-7B, and an ensemble model, Hunyuan-MT-Chimera. Hunyuan-MT-7B is a lightweight 7-billion-parameter model that translates source text into a target language. It supports mutual translation among 33 languages, including five ethnic-minority languages of China. In the WMT25 machine translation competition, Hunyuan-MT-7B took first place in 30 of the 31 language categories it entered, demonstrating strong translation capability. For translation tasks, Tencent Hunyuan proposed a comprehensive training framework covering pre-training, supervised fine-tuning, translation enhancement, and ensemble refinement, achieving state-of-the-art performance among models of similar scale. The model is computationally efficient and easy to deploy, making it suitable for a wide range of application scenarios.
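Hunyuan-MT-7B is a single-turn translation model rather than a general chat assistant, so the input is typically a fixed translation instruction followed by the source text. The template strings below follow the Hugging Face model card; since this page does not specify them, treat them as an assumption. A minimal sketch in Python:

```python
def build_translation_prompt(source_text: str, target_language: str,
                             chinese_pair: bool = False) -> str:
    """Build a single-turn translation prompt for Hunyuan-MT-7B.

    Both templates are taken from the Hugging Face model card and are
    an assumption here: this page does not document the prompt format.
    """
    if chinese_pair:
        # ZH<=>XX pairs reportedly use a Chinese-language instruction.
        return f"把下面的文本翻译成{target_language}，不要额外解释。\n\n{source_text}"
    # All other language pairs use an English-language instruction.
    return (f"Translate the following segment into {target_language}, "
            f"without additional explanation.\n\n{source_text}")


prompt = build_translation_prompt("Machine translation is fun.", "German")
```

The prompt is then sent as a single user message; the model is expected to reply with the translation only, with no extra explanation.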

Available Serverless

Run queries immediately, pay only for usage

$0.00 / $0.00 per 1M tokens (input/output)

Metadata

Created on

Sep 18, 2025

License

Provider

Tencent

HuggingFace

Specification

State

Available

Architecture

Calibrated

No

Mixture of Experts

No

Total Parameters

7B

Activated Parameters

7B

Reasoning

No

Precision

FP8

Context length

33K

Max Tokens

33K

Supported Functionality

Serverless

Supported

Serverless LoRA

Not supported

Fine-tuning

Not supported

Embeddings

Not supported

Rerankers

Not supported

Support image input

Not supported

JSON Mode

Supported

Structured Outputs

Not supported

Tools

Not supported

FIM Completion

Not supported

Chat Prefix Completion

Supported
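Since the model is available serverless, a call can be sketched as an OpenAI-style chat-completions request carrying the translation prompt. The payload shape and the use of the repository path as the model identifier are assumptions; check the provider's API reference for the actual endpoint and field names.

```python
import json

# Assumption: the serverless endpoint accepts an OpenAI-compatible
# chat-completions body, and the model is addressed by its repo path.
MODEL_ID = "tencent/Hunyuan-MT-7B"


def build_request(source_text: str, target_language: str) -> dict:
    """Assemble a hypothetical chat-completions request body."""
    return {
        "model": MODEL_ID,
        "messages": [{
            "role": "user",
            "content": (f"Translate the following segment into "
                        f"{target_language}, without additional explanation."
                        f"\n\n{source_text}"),
        }],
        # Translation is deterministic work; greedy decoding is typical.
        "temperature": 0.0,
        "max_tokens": 512,
    }


body = json.dumps(build_request("Bonjour le monde.", "English"))
```

Tools and structured outputs are not supported for this model, so the request stays a plain text-in, text-out call; JSON mode and chat prefix completion are the only listed extras.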

Model FAQs: Usage, Deployment

Learn how to use, fine-tune, and deploy this model with ease.

Ready to accelerate your AI development?
