Qwen3-235B-A22B-Thinking-2507 API, Deployment, Pricing

Qwen/Qwen3-235B-A22B-Thinking-2507

Qwen3-235B-A22B-Thinking-2507 is a member of the Qwen3 large language model series developed by Alibaba's Qwen team and specializes in highly complex reasoning tasks. The model is built on a Mixture-of-Experts (MoE) architecture with 235 billion total parameters and roughly 22 billion activated per token, which keeps inference cost low while preserving strong performance. As a dedicated 'thinking' model, it shows markedly improved results on tasks requiring expert-level reasoning, such as logic, mathematics, science, coding, and academic benchmarks, achieving state-of-the-art results among open-source thinking models. It also offers enhanced general capabilities, including instruction following, tool usage, and text generation, and natively supports a 256K (262,144-token) context window, making it well suited to scenarios that combine deep reasoning with long-document processing.

API Usage

curl --request POST \
  --url https://api.siliconflow.com/v1/chat/completions \
  --header 'Authorization: Bearer <token>' \
  --header 'Content-Type: application/json' \
  --data '{
  "model": "Qwen/Qwen3-235B-A22B-Thinking-2507",
  "messages": [
    {
      "role": "user",
      "content": "Tell me a story"
    }
  ]
}'
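The same request can be assembled from Python. The sketch below mirrors the curl call above using only the standard library; it builds the headers and JSON body but does not send them (the token is a placeholder, and the POST itself can be done with any HTTP client).

```python
import json

# Endpoint and model name taken from the curl example above.
API_URL = "https://api.siliconflow.com/v1/chat/completions"
MODEL = "Qwen/Qwen3-235B-A22B-Thinking-2507"

def build_chat_request(token: str, user_content: str) -> tuple[dict, bytes]:
    """Assemble the headers and JSON body matching the curl call above."""
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": user_content}],
    }
    return headers, json.dumps(payload).encode("utf-8")

headers, body = build_chat_request("<token>", "Tell me a story")
# POST `body` to API_URL with these headers using e.g. urllib.request
# or the third-party `requests` package.
```
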

Details

Model Provider: Qwen
Type: text
Sub Type: chat
Size: 235B
Publish Time: Jul 28, 2025
Input Price: $0.35 / M Tokens
Output Price: $1.42 / M Tokens
Context length: 262K
Tags: MoE, 235B, 262K
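As a rough illustration of the per-million-token pricing above, the sketch below estimates the cost of a single request; the token counts in the example are hypothetical.

```python
# Rates from the Details section above (USD per million tokens).
INPUT_PRICE_PER_M = 0.35
OUTPUT_PRICE_PER_M = 1.42

def estimate_cost_usd(input_tokens: int, output_tokens: int) -> float:
    """Estimate the cost of one request at the listed rates."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# Hypothetical request: 10,000 prompt tokens, 2,000 completion tokens.
cost = estimate_cost_usd(10_000, 2_000)
# (10,000 * 0.35 + 2,000 * 1.42) / 1e6 = 0.00634 USD
```
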

Model FAQs: Usage, Deployment

Learn how to use, fine-tune, and deploy this model with ease.

What is the Qwen3-235B-A22B-Thinking-2507 model, and what are its core capabilities and technical specifications?

In which business scenarios does Qwen3-235B-A22B-Thinking-2507 perform well? Which industries or applications is it suitable for?

How can the performance and effectiveness of Qwen3-235B-A22B-Thinking-2507 be optimized in actual business use?

Compared with other models, when should Qwen3-235B-A22B-Thinking-2507 be selected?

What are SiliconFlow's key strengths in AI serverless deployment for Qwen3-235B-A22B-Thinking-2507?

What makes SiliconFlow the top platform for Qwen3-235B-A22B-Thinking-2507 API?

Ready to accelerate your AI development?

© 2025 SiliconFlow Technology PTE. LTD.