GLM-Z1-9B-0414

THUDM/GLM-Z1-9B-0414

About GLM-Z1-9B-0414

GLM-Z1-9B-0414 is a small model in the GLM series with only 9 billion parameters that maintains the series' open-source tradition while showing surprisingly strong capabilities. Despite its smaller scale, it delivers excellent performance on mathematical reasoning and general tasks, placing it at a leading level among open-source models of comparable size. The research team trained this 9B model with the same set of techniques used for the larger models in the series. It strikes a strong balance between efficiency and effectiveness, especially in resource-constrained scenarios, making it a compelling option for lightweight deployment. The model features deep-thinking capabilities and handles long contexts through YaRN, which makes it particularly suitable for applications that require mathematical reasoning under limited computational resources.
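For local experimentation, the checkpoint can be loaded from the THUDM/GLM-Z1-9B-0414 repository listed above. The sketch below assumes the standard Hugging Face transformers AutoModel interface works for this checkpoint (a recent transformers release, or trust_remote_code, may be required; check the model card), and the prompt is purely illustrative.

```python
# Minimal local-inference sketch with Hugging Face transformers.
# Assumption: the checkpoint loads through the standard AutoModel* interface;
# consult the model card for the required transformers version.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "THUDM/GLM-Z1-9B-0414"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # ~18 GB of weights; fits a single 24 GB GPU for short contexts
    device_map="auto",
)

# A math-reasoning style prompt; the Z1 series is tuned for deep-thinking output.
messages = [{"role": "user", "content": "What is the sum of the first 100 positive integers?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=1024, temperature=0.6, do_sample=True)
print(tokenizer.decode(outputs[0][inputs.shape[1]:], skip_special_tokens=True))
```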

Available Serverless

Run queries immediately, pay only for usage

$0.086 / $0.086 per 1M tokens (input / output)
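A typical serverless query goes through an OpenAI-compatible chat completions endpoint. In the sketch below, the base URL, API key environment variable, and model slug are placeholders (assumptions, not the provider's actual values); substitute the values from the provider's documentation. The cost line simply applies the listed $0.086 per 1M token rate to the reported usage.

```python
# Sketch of a serverless query via an OpenAI-compatible endpoint.
# BASE URL, env var, and model slug are placeholders -- check the provider docs.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.example.com/v1",   # placeholder endpoint
    api_key=os.environ["PROVIDER_API_KEY"],  # placeholder env var
)

resp = client.chat.completions.create(
    model="THUDM/GLM-Z1-9B-0414",            # slug may differ per provider
    messages=[{"role": "user", "content": "Prove that the square root of 2 is irrational."}],
    max_tokens=2048,
)
print(resp.choices[0].message.content)

# Usage-based cost at $0.086 per 1M tokens for both input and output:
usage = resp.usage
cost = (usage.prompt_tokens + usage.completion_tokens) / 1_000_000 * 0.086
print(f"request cost: ${cost:.6f}")
```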

Metadata

Created on: Apr 18, 2025
License: MIT
Provider: Z.ai
HuggingFace: THUDM/GLM-Z1-9B-0414

Specification

State: Available

Architecture

Calibrated: Yes
Mixture of Experts: No
Total Parameters: 9 billion
Activated Parameters: 9 billion
Reasoning: No
Precision: FP8
Context length: 131K
Max Tokens: 131K

Supported Functionality

Serverless: Supported
Serverless LoRA: Not supported
Fine-tuning: Not supported
Embeddings: Not supported
Rerankers: Not supported
Image input: Not supported
JSON Mode: Supported
Structured Outputs: Not supported
Tools: Supported
FIM Completion: Not supported
Chat Prefix Completion: Not supported
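Since JSON Mode and Tools are listed as supported, the sketch below shows both through the same OpenAI-compatible client as in the serverless example. The response_format and tools parameters follow the common OpenAI-style convention, which is an assumption here; the tool name evaluate_expression is hypothetical and used only for illustration. Confirm the exact parameters with the provider's documentation.

```python
# Sketch of JSON mode and tool calling (OpenAI-style parameters assumed).
import os
from openai import OpenAI

client = OpenAI(base_url="https://api.example.com/v1",   # placeholder endpoint
                api_key=os.environ["PROVIDER_API_KEY"])   # placeholder env var

# JSON Mode (listed as supported): constrain the reply to valid JSON.
resp = client.chat.completions.create(
    model="THUDM/GLM-Z1-9B-0414",
    messages=[{"role": "user", "content": "Return the prime factors of 360 as JSON."}],
    response_format={"type": "json_object"},
)
print(resp.choices[0].message.content)

# Tools (listed as supported): declare a hypothetical function the model may call.
tools = [{
    "type": "function",
    "function": {
        "name": "evaluate_expression",  # hypothetical tool for illustration
        "description": "Evaluate an arithmetic expression and return the result.",
        "parameters": {
            "type": "object",
            "properties": {"expression": {"type": "string"}},
            "required": ["expression"],
        },
    },
}]
resp = client.chat.completions.create(
    model="THUDM/GLM-Z1-9B-0414",
    messages=[{"role": "user", "content": "What is 37 * 41?"}],
    tools=tools,
)
print(resp.choices[0].message.tool_calls)
```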

Model FAQs: Usage, Deployment

Learn how to use, fine-tune, and deploy this model with ease.
