ERNIE-4.5-300B-A47B

baidu/ERNIE-4.5-300B-A47B

About ERNIE-4.5-300B-A47B

ERNIE-4.5-300B-A47B is a large language model developed by Baidu on a Mixture-of-Experts (MoE) architecture. The model has 300 billion total parameters but activates only 47 billion per token during inference, balancing strong performance with computational efficiency. As one of the core models in the ERNIE 4.5 series, it is trained on the PaddlePaddle deep learning framework and demonstrates strong capabilities in text understanding, generation, reasoning, and coding. The model uses a multimodal heterogeneous MoE pre-training method that improves its overall abilities through joint training on text and visual modalities, with notable results in instruction following and world-knowledge memorization. Baidu has open-sourced this model, along with others in the series, to promote AI research and applications.
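The key MoE idea described above — many experts in total, only a few activated per token — can be illustrated with a toy top-k gating layer. This is a minimal sketch, not ERNIE's actual routing; the expert count, gating function, and dimensions are arbitrary.

```python
import numpy as np

def moe_forward(x, experts, gate_w, k=2):
    """Toy top-k MoE layer: route one token to k of n experts.

    Only the k selected experts run for this token, which is how a
    model with 300B total parameters can activate only ~47B per token.
    """
    logits = x @ gate_w                      # (n_experts,) gating scores
    topk = np.argsort(logits)[-k:]           # indices of the k best experts
    weights = np.exp(logits[topk])
    weights /= weights.sum()                 # softmax over the selected experts
    # Weighted sum of the chosen experts' outputs; the rest stay idle.
    return sum(w * experts[i](x) for i, w in zip(topk, weights))

rng = np.random.default_rng(0)
d, n_experts = 8, 4
# Each "expert" is just a fixed linear map for this sketch.
mats = [rng.standard_normal((d, d)) for _ in range(n_experts)]
experts = [lambda x, M=M: x @ M for M in mats]
gate_w = rng.standard_normal((d, n_experts))

x = rng.standard_normal(d)
y = moe_forward(x, experts, gate_w, k=2)
print(y.shape)  # (8,)
```

With k=2 of 4 experts, only half the expert parameters touch each token; scaling the same idea up is what separates total from activated parameter counts.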

Available Serverless

Run queries immediately, pay only for usage

$0.28 / $1.10 per 1M tokens (input / output)
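At the listed rates ($0.28 per 1M input tokens, $1.10 per 1M output tokens), per-request cost is a simple linear function of token counts. A minimal estimator, assuming those rates:

```python
def estimate_cost(input_tokens, output_tokens,
                  in_price=0.28, out_price=1.10):
    """Estimate serverless cost in USD at the listed per-1M-token rates."""
    return (input_tokens / 1e6) * in_price + (output_tokens / 1e6) * out_price

# e.g. a 2,000-token prompt with a 500-token reply:
print(estimate_cost(2_000, 500))  # about $0.00111
```

Since output tokens cost roughly 4x input tokens here, long generations dominate the bill.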

Metadata

Created on

Jul 2, 2025

License

Provider

BAIDU

Specification

State

Available

Architecture

Calibrated

No

Mixture of Experts

Yes

Total Parameters

300 billion

Activated Parameters

47 billion

Reasoning

No

Precision

FP8

Context length

131K

Max Tokens

131K

Supported Functionality

Serverless

Supported

Serverless LoRA

Not supported

Fine-tuning

Not supported

Embeddings

Not supported

Rerankers

Not supported

Support image input

Not supported

JSON Mode

Supported

Structured Outputs

Not supported

Tools

Not supported

FIM Completion

Not supported

Chat Prefix Completion

Not supported
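Of the features above, JSON Mode is supported while Structured Outputs and Tools are not, so JSON output is requested via the `response_format` field. A sketch of a request body, assuming an OpenAI-compatible chat completions API — the endpoint URL, auth header, and exact field names depend on the serving provider:

```python
import json

# Hypothetical OpenAI-compatible request body; provider details may differ.
payload = {
    "model": "baidu/ERNIE-4.5-300B-A47B",
    "messages": [
        {"role": "system", "content": "Reply with a JSON object only."},
        {"role": "user", "content": "Summarize ERNIE 4.5 in a JSON object."},
    ],
    "response_format": {"type": "json_object"},  # JSON Mode (supported above)
    "max_tokens": 1024,  # well under the 131K limit
}
body = json.dumps(payload)
print(json.loads(body)["response_format"])  # {'type': 'json_object'}
```

Since tool calling is not supported, any structure beyond a plain JSON object has to be enforced by the prompt and validated client-side.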

Model FAQs: Usage, Deployment

Learn how to use, fine-tune, and deploy this model with ease.

Ready to accelerate your AI development?
