ERNIE-4.5-300B-A47B
baidu/ERNIE-4.5-300B-A47B
ERNIE-4.5-300B-A47B is a large language model developed by Baidu on a Mixture-of-Experts (MoE) architecture. The model has 300 billion total parameters but activates only 47 billion per token during inference, balancing strong performance with computational efficiency. As one of the core models in the ERNIE 4.5 series, it is trained on the PaddlePaddle deep learning framework and performs well on text understanding, generation, reasoning, and coding tasks. It uses a multimodal heterogeneous MoE pre-training method that jointly trains on text and visual modalities, improving overall capability, with notable gains in instruction following and world-knowledge memorization. Baidu has open-sourced this model, along with others in the series, to promote AI research and applications.
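A quick back-of-the-envelope sketch of the activation ratio implied by the figures above (300B total parameters, 47B active per token); the numbers come directly from this card:

```python
# MoE activation ratio implied by this model card's figures.
total_params_b = 300   # total parameters, in billions
active_params_b = 47   # parameters activated per token, in billions

active_fraction = active_params_b / total_params_b
print(f"Active per token: {active_fraction:.1%} of total parameters")
```

So roughly one sixth of the model's weights participate in any single token's forward pass, which is where the inference-cost savings over a dense 300B model come from.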
Details
Model Provider: baidu
Type: text
Sub Type: chat
Size: MoE
Publish Time: Jul 2, 2025
Input Price: $0.29 / M Tokens
Output Price: $1.15 / M Tokens
Context Length: 128K
Tags: MoE, 300B, 128K
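As an illustration of the pricing listed above ($0.29 per million input tokens, $1.15 per million output tokens), a minimal cost estimator; the function name and example token counts are hypothetical, the prices come from this card:

```python
# Per-million-token prices listed on this model card (USD).
INPUT_PRICE_PER_M = 0.29
OUTPUT_PRICE_PER_M = 1.15

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost for one request at the listed rates."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# Example: a 10,000-token prompt producing a 2,000-token reply.
print(f"${estimate_cost(10_000, 2_000):.4f}")  # $0.0052
```

Note that output tokens cost roughly 4x input tokens, so long generations dominate the bill even for large prompts.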