The Baidu ERNIE team has announced the release of ERNIE-4.5-300B-A47B, a powerful open-source large language model now available on the SiliconFlow platform.
Built on a Mixture-of-Experts (MoE) architecture, the model has 300B total parameters, of which 47B are activated per token. It delivers strong performance in mathematical reasoning, accurate computation, and code generation, making it particularly well-suited for math- and programming-heavy applications.
SiliconFlow offers:
Inference Acceleration: Optimized for lower latency and higher throughput.
Extended Context: 128K token context window.
Cost-Optimized Pricing: $0.29/M tokens (input) and $1.15/M tokens (output). A quick cost estimate is sketched below.
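As a rough illustration of the per-million-token pricing (a sketch only; actual billing follows SiliconFlow's metering, and the token counts here are made-up example values):

```python
# Rough cost estimate at the listed rates (illustration only).
input_tokens = 2_000   # example prompt size (assumed)
output_tokens = 500    # example completion size (assumed)

cost = input_tokens / 1e6 * 0.29 + output_tokens / 1e6 * 1.15
print(f"Estimated cost: ${cost:.6f}")  # ~$0.001155
```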
Technical Highlights
ERNIE-4.5-300B-A47B's strong instruction-following and knowledge-utilization capabilities in single-turn, multi-turn, and multilingual scenarios may be attributed to a unified rewarding system, which incorporates carefully designed reward mechanisms that guide the model toward better interpreting and following diverse user instructions while drawing on its internal knowledge.
Quick Start
Try the ERNIE-4.5-300B-A47B model directly on the SiliconFlow playground.
Quick Access to API
The following Python example demonstrates how to invoke the ERNIE-4.5-300B-A47B model through SiliconFlow's API endpoint. For the full parameter reference, see the SiliconFlow API documentation.
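Below is a minimal sketch using the requests library against SiliconFlow's chat-completions endpoint. The model identifier (baidu/ERNIE-4.5-300B-A47B) and the SILICONFLOW_API_KEY environment variable are assumptions; substitute the exact model name and key from your SiliconFlow dashboard.

```python
import os
import requests

# Minimal sketch of a chat-completions request to SiliconFlow.
# Verify the endpoint and model identifier against the API docs.
API_URL = "https://api.siliconflow.cn/v1/chat/completions"
API_KEY = os.environ["SILICONFLOW_API_KEY"]  # assumed env var holding your key

payload = {
    "model": "baidu/ERNIE-4.5-300B-A47B",  # assumed model identifier
    "messages": [
        {
            "role": "user",
            "content": "Write a Python function that checks whether a number is prime.",
        }
    ],
    "max_tokens": 512,
    "temperature": 0.7,
}

response = requests.post(
    API_URL,
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    json=payload,
    timeout=60,
)
response.raise_for_status()

# Print the model's reply from the first choice.
print(response.json()["choices"][0]["message"]["content"])
```

The same request can be made with any OpenAI-compatible client by pointing its base URL at SiliconFlow and passing the model name above.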
ERNIE-4.5-300B-A47B is an optimal choice for developers and researchers seeking advanced natural language understanding and generation capabilities. With its strong performance in generalization, reasoning, and coding tasks, the model is well-suited for building intelligent applications and exploring innovative use cases. It enables teams to rapidly deploy production-ready solutions with cutting-edge language capabilities.
Start building with ERNIE-4.5-300B-A47B today at SiliconFlow!