
Qwen2.5-Coder 32B

Alibaba · Open Weights

The flagship open-weight coding model of the Qwen2.5 series. Trained on 5.5 trillion tokens of code-centric data, it matches the coding capabilities of GPT-4o on benchmarks such as HumanEval and Aider, and it supports 92 programming languages.

Released: 2024-11-12 · 32B parameters · Decoder-only Transformer · Apache 2.0

Specifications

Parameters: 32B
Architecture: Decoder-only Transformer
License: Apache 2.0
Context Window: 128,000 tokens
Max Output: 8,192 tokens
Training Data Cutoff: Nov 2024
Type: Text
Modalities: Text
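
The limits above map directly onto inference settings. A minimal generation sketch, assuming the Hugging Face checkpoint Qwen/Qwen2.5-Coder-32B-Instruct and a recent transformers release (the prompt and max_new_tokens value are illustrative only):

# Minimal generation sketch for the assumed checkpoint Qwen/Qwen2.5-Coder-32B-Instruct.
# Requires a recent `transformers` plus `accelerate`, and enough GPU memory for a
# 32B model (or a quantized variant).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-Coder-32B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

messages = [
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Write a Python function that checks whether a string is a palindrome."},
]

# Render the conversation with the model's chat template and generate a reply,
# staying well under the 8,192-token output cap listed above.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output_ids = model.generate(input_ids, max_new_tokens=1024)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))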


Advanced Specifications

Model Family: Qwen
API Access: Available (example below)
Chat Interface: Available
Multilingual Support: Yes
Variants: Qwen2.5-Coder-32B-Instruct, Qwen2.5-Coder-0.5B, Qwen2.5-Coder-1.5B, Qwen2.5-Coder-3B, Qwen2.5-Coder-7B, Qwen2.5-Coder-14B
Hardware Support: CUDA, Metal
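
API access is typically exposed through an OpenAI-compatible endpoint. The sketch below assumes Alibaba Cloud Model Studio's (DashScope) compatible-mode base URL, a DASHSCOPE_API_KEY environment variable, and the model identifier qwen2.5-coder-32b-instruct; all three are assumptions to verify against the provider's documentation, and any OpenAI-compatible gateway serving the model works the same way.

# Hedged sketch of calling the model through an OpenAI-compatible API.
# Base URL, API-key variable, and model identifier are assumptions; substitute
# the values your provider documents.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DASHSCOPE_API_KEY"],
    base_url="https://dashscope-intl.aliyuncs.com/compatible-mode/v1",
)

response = client.chat.completions.create(
    model="qwen2.5-coder-32b-instruct",
    messages=[
        {
            "role": "user",
            "content": "Explain what this does and simplify it: [x * 2 for x in data if x > 0]",
        }
    ],
)
print(response.choices[0].message.content)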

Capabilities & Limitations

Capabilities: code generation, code repair, code reasoning, multilingual coding
Notable Use Cases: Coding assistant, code review, automated debugging
Function Calling Support: Yes
Tool Use Support: Yes (sketch below)
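
Function calling and tool use build on the chat template. A minimal sketch, assuming the Qwen/Qwen2.5-Coder-32B-Instruct template accepts a tools argument (as it does in recent transformers releases); the run_tests tool is hypothetical and exists only for illustration:

# Sketch of function-calling prompt construction via the chat template.
# `run_tests` is a hypothetical tool schema; the model is expected to reply
# with a structured tool call that the calling code parses and executes.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-Coder-32B-Instruct")

tools = [
    {
        "type": "function",
        "function": {
            "name": "run_tests",
            "description": "Run the project's unit tests and return any failures.",
            "parameters": {
                "type": "object",
                "properties": {
                    "path": {"type": "string", "description": "Test file or directory to run."}
                },
                "required": ["path"],
            },
        },
    }
]

messages = [{"role": "user", "content": "The tests in tests/test_parser.py are failing. Please investigate."}]

# Render a prompt that embeds the tool schema ahead of the conversation.
prompt = tokenizer.apply_chat_template(
    messages, tools=tools, add_generation_prompt=True, tokenize=False
)
print(prompt)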

Related Models