GLM-5
About GLM-5
GLM-5 is a next-generation open-source model for complex systems engineering and long-horizon agentic tasks, scaled to ~744B sparse parameters (~40B active) and pretrained on ~28.5T tokens. It integrates DeepSeek Sparse Attention (DSA) to retain long-context capacity while reducing inference cost, and uses the "slime" asynchronous RL stack to deliver strong performance on reasoning, coding, and agentic benchmarks.
Available Serverless
Run queries immediately, pay only for usage.
$1.00 (input) / $3.20 (output) per 1M tokens
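As a quick illustration of serverless usage and pay-per-use pricing, here is a minimal sketch of a single query, assuming an OpenAI-compatible chat completions endpoint. The base URL, API-key variable, and model identifier are placeholders, not confirmed by this page; only the per-million-token prices above are taken from it.

```python
# Minimal serverless query sketch, assuming an OpenAI-compatible endpoint.
# base_url, the API-key env var, and the model id are placeholders.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://example-provider.com/v1",  # placeholder endpoint
    api_key=os.environ["PROVIDER_API_KEY"],      # placeholder env var
)

resp = client.chat.completions.create(
    model="z-ai/glm-5",                          # placeholder model id
    messages=[{"role": "user",
               "content": "Summarize DeepSeek Sparse Attention in two sentences."}],
    max_tokens=512,
)
print(resp.choices[0].message.content)

# Pay only for usage: estimate this call's cost from the listed prices
# ($1.00 per 1M input tokens, $3.20 per 1M output tokens).
usage = resp.usage
cost = usage.prompt_tokens / 1e6 * 1.00 + usage.completion_tokens / 1e6 * 3.20
print(f"~${cost:.6f}")
```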
Metadata
Specification
State: Available
Architecture: Mixture of Experts (MoE) with DeepSeek Sparse Attention (DSA) and an asynchronous RL stack
Calibrated: No
Mixture of Experts: Yes
Total Parameters: 750B
Activated Parameters: 40B
Reasoning: No
Precision: FP8
Context Length: 205K
Max Tokens: 131K
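A small sketch of staying within the listed limits (205K context, 131K max output tokens). Whether output tokens count against the context window is an assumption made here for illustration, not something this page states.

```python
# Budget check against the listed limits. Treating the output budget as
# part of the context window is an assumption, not a documented rule.

CONTEXT_LIMIT = 205_000      # listed context length (205K)
MAX_OUTPUT_TOKENS = 131_000  # listed max output tokens (131K)

def fits(prompt_tokens: int, requested_output_tokens: int) -> bool:
    """True if a request stays within both listed limits."""
    if requested_output_tokens > MAX_OUTPUT_TOKENS:
        return False
    return prompt_tokens + requested_output_tokens <= CONTEXT_LIMIT

print(fits(100_000, 50_000))   # True
print(fits(100_000, 140_000))  # False: exceeds the 131K output cap
```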
Supported Functionality
Serverless: Supported
Serverless LoRA: Not supported
Fine-tuning: Not supported
Embeddings: Not supported
Rerankers: Not supported
Image input: Not supported
JSON Mode: Not supported
Structured Outputs: Not supported
Tools: Supported
FIM Completion: Not supported
Chat Prefix Completion: Not supported
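Since tool use is the only agentic capability listed as Supported, here is a hedged sketch of a tool-calling request through an OpenAI-compatible client. The endpoint, model identifier, and the get_weather tool are illustrative assumptions, not details from this page.

```python
# Tool-calling sketch against a GLM-5 serverless endpoint, assuming an
# OpenAI-compatible API. base_url, the API-key env var, the model id, and
# the get_weather tool are placeholders for illustration only.
import json
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://example-provider.com/v1",  # placeholder endpoint
    api_key=os.environ["PROVIDER_API_KEY"],      # placeholder env var
)

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",                   # illustrative tool only
        "description": "Look up current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

resp = client.chat.completions.create(
    model="z-ai/glm-5",                          # placeholder model id
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
)

# If the model chose to call the tool, its arguments arrive as a JSON string.
for call in resp.choices[0].message.tool_calls or []:
    print(call.function.name, json.loads(call.function.arguments))
```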
Compare with Other Models
See how this model stacks up against others.

| Model (Z.ai) | Released | Total Context | Max Output | Input ($/1M tokens) | Output ($/1M tokens) |
|---|---|---|---|---|---|
| GLM-5 | Feb 12, 2026 | 205K | 131K | $1.00 | $3.20 |
| GLM-4.7 | Dec 23, 2025 | 205K | 205K | $0.42 | $2.20 |
| GLM-4.6V | Dec 8, 2025 | 131K | 131K | $0.30 | $0.90 |
| GLM-4.6 | Oct 4, 2025 | 205K | 205K | $0.39 | $1.90 |
| GLM-4.5-Air | Jul 28, 2025 | 131K | 131K | $0.14 | $0.86 |
| GLM-4.5V | Aug 13, 2025 | 66K | 66K | $0.14 | $0.86 |
| GLM-4.1V-9B-Thinking | Jul 4, 2025 | 66K | 66K | $0.035 | $0.14 |
| GLM-Z1-32B-0414 | Apr 18, 2025 | 131K | 131K | $0.14 | $0.57 |
| GLM-4-32B-0414 | Apr 18, 2025 | 33K | 33K | $0.27 | $0.27 |
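To make the price differences concrete, the sketch below estimates what the same hypothetical workload would cost on each model in the table. The 10M-input / 1M-output token workload is invented; only the per-million-token rates come from this page.

```python
# Rough per-workload cost comparison across the listed Z.ai models.
# Prices ($/1M tokens) are from the comparison table above; the workload
# size (10M input tokens, 1M output tokens) is a made-up example.

PRICES = {  # model: (input $/1M tokens, output $/1M tokens)
    "GLM-5": (1.00, 3.20),
    "GLM-4.7": (0.42, 2.20),
    "GLM-4.6V": (0.30, 0.90),
    "GLM-4.6": (0.39, 1.90),
    "GLM-4.5-Air": (0.14, 0.86),
    "GLM-4.5V": (0.14, 0.86),
    "GLM-4.1V-9B-Thinking": (0.035, 0.14),
    "GLM-Z1-32B-0414": (0.14, 0.57),
    "GLM-4-32B-0414": (0.27, 0.27),
}

INPUT_M, OUTPUT_M = 10, 1  # millions of tokens in the hypothetical workload

for model, (in_price, out_price) in PRICES.items():
    cost = INPUT_M * in_price + OUTPUT_M * out_price
    print(f"{model:22s} ${cost:,.2f}")
```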
