Model Comparison: DeepSeek-R1 vs GLM-4.5V

Feb 15, 2026

Pricing

|        | DeepSeek-R1      | GLM-4.5V         |
|--------|------------------|------------------|
| Input  | $0.50 / M tokens | $0.14 / M tokens |
| Output | $2.18 / M tokens | $0.86 / M tokens |
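At these rates, the per-request cost gap can be estimated directly. A minimal sketch using the prices from the table above; the token counts in the example workload are hypothetical:

```python
# Per-million-token prices (USD) from the pricing table above.
PRICES = {
    "DeepSeek-R1": {"input": 0.50, "output": 2.18},
    "GLM-4.5V":    {"input": 0.14, "output": 0.86},
}

def request_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost of one request at the listed per-M-token rates."""
    p = PRICES[model]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1_000_000

# Hypothetical workload: 10K prompt tokens, 2K completion tokens.
for model in PRICES:
    print(f"{model}: ${request_cost(model, 10_000, 2_000):.5f}")
```

For that workload, GLM-4.5V comes out roughly 3x cheaper per request.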

Metadata

|            | DeepSeek-R1  | GLM-4.5V     |
|------------|--------------|--------------|
| Created on | Jan 20, 2025 | Aug 10, 2025 |
| License    | MIT          | MIT          |
| Provider   | DeepSeek     | Z.ai         |

Specification

|                      | DeepSeek-R1 | GLM-4.5V |
|----------------------|-------------|----------|
| State                | Available   | Available |
| Architecture         | MoE         | MoE (GLM-V family, based on GLM-4.5-Air; incorporates Chain-of-Thought reasoning, Reinforcement Learning with Curriculum Sampling (RLCS), and a Thinking Mode switch) |
| Calibrated           | No          | Yes      |
| Mixture of Experts   | Yes         | Yes      |
| Total Parameters     | 671B        | 106B     |
| Activated Parameters | 37B         | 12B      |
| Reasoning            | No          | No       |
| Precision            | FP8         | FP8      |
| Context Length       | 164K        | 66K      |
| Max Tokens           | 164K        | 66K      |
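Since both models ship in FP8 (one byte per parameter), the table yields a rough lower bound on weight memory and the fraction of parameters active per token in each MoE. A back-of-envelope sketch only; it ignores activations, KV cache, and any layers kept at higher precision:

```python
# Parameter counts (billions) from the specification table above.
MODELS = {
    "DeepSeek-R1": {"total_b": 671, "active_b": 37},
    "GLM-4.5V":    {"total_b": 106, "active_b": 12},
}

for name, m in MODELS.items():
    # FP8 = 1 byte/parameter, so N billion params ~= N GB of weights.
    weight_gb = m["total_b"]
    active_pct = 100 * m["active_b"] / m["total_b"]
    print(f"{name}: ~{weight_gb} GB of FP8 weights, "
          f"{active_pct:.1f}% of parameters active per token")
```

Despite the 6x gap in total size, both models activate only a small slice of their experts per token (about 5.5% for DeepSeek-R1, about 11.3% for GLM-4.5V).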

Supported Functionality

|                        | DeepSeek-R1   | GLM-4.5V      |
|------------------------|---------------|---------------|
| Serverless             | Supported     | Supported     |
| Serverless LoRA        | Not supported | Not supported |
| Fine-tuning            | Not supported | Not supported |
| Embeddings             | Not supported | Not supported |
| Rerankers              | Not supported | Not supported |
| Image Input            | Not supported | Not supported |
| JSON Mode              | Supported     | Not supported |
| Structured Outputs     | Not supported | Not supported |
| Tools                  | Supported     | Supported     |
| FIM Completion         | Supported     | Not supported |
| Chat Prefix Completion | Supported     | Not supported |
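JSON Mode and Tools are typically exercised through an OpenAI-style chat completions request. A minimal sketch that builds (but does not send) the payloads the table implies; the field names, model identifier, and the `get_weather` tool are assumptions based on the common OpenAI-compatible API shape, not taken from this page:

```python
def json_mode_payload(model: str, prompt: str) -> dict:
    """Chat request asking the server to constrain output to valid JSON."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "response_format": {"type": "json_object"},  # JSON Mode switch
    }

def tools_payload(model: str, prompt: str) -> dict:
    """Chat request advertising a single (hypothetical) weather tool."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "tools": [{
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical example tool
                "description": "Look up current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }],
    }

# Per the table, JSON Mode is listed only for DeepSeek-R1,
# while Tools are listed for both models.
req = json_mode_payload("DeepSeek-R1", 'Reply as JSON: {"ok": true}')
```

Per the table, a `response_format` request like the first payload would only be honored by DeepSeek-R1, while the tools payload should work against either model.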

© 2025 SiliconFlow