ModelBox Qwen2.5 Coder

ModelBox now supports Qwen2.5 Coder inference! Qwen2.5-Coder-32B-Instruct sets the bar as the current SOTA open-source code model, matching the coding power of GPT-4o. Start building today.

Qwen2.5 Coder Models

Qwen2.5-Coder is the latest series of code-specific Qwen large language models (formerly known as CodeQwen). Qwen2.5-Coder now covers six mainstream model sizes — 0.5B, 1.5B, 3B, 7B, 14B, and 32B parameters — to meet the needs of different developers. Qwen2.5-Coder brings the following improvements over CodeQwen1.5:

  • Significant improvements in code generation, code reasoning, and code fixing. Building on the strong Qwen2.5 base, training was scaled up to 5.5 trillion tokens, including source code, text-code grounding data, and synthetic data. Qwen2.5-Coder-32B is now the state-of-the-art open-source code LLM, with coding abilities matching those of GPT-4o.
  • A more comprehensive foundation for real-world applications such as code agents. Qwen2.5-Coder not only enhances coding capabilities but also maintains strengths in mathematics and general competencies, and supports long contexts of up to 128K tokens.

Available at:

https://app.model.box/model/qwen/qwen2.5-coder-32b
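
To give a feel for calling the hosted model, here is a minimal sketch. It assumes an OpenAI-compatible chat-completions endpoint; the base URL shown is illustrative (check your ModelBox dashboard for the real endpoint and your API key), while the model id comes from the page linked above.

```python
import json

# Hypothetical OpenAI-compatible endpoint -- ModelBox's actual base URL
# may differ; confirm it in the app.model.box dashboard.
API_URL = "https://api.model.box/v1/chat/completions"
MODEL_ID = "qwen/qwen2.5-coder-32b"  # model id from the link above


def build_request(prompt: str, max_tokens: int = 256) -> dict:
    """Build an OpenAI-style chat-completions payload for Qwen2.5-Coder."""
    return {
        "model": MODEL_ID,
        "messages": [
            {"role": "system", "content": "You are a helpful coding assistant."},
            {"role": "user", "content": prompt},
        ],
        "max_tokens": max_tokens,
    }


# Inspect the payload; send it with any HTTP client plus an
# `Authorization: Bearer <your-api-key>` header.
payload = build_request("Write a Python function that reverses a string.")
print(json.dumps(payload, indent=2))
```

With an OpenAI-compatible endpoint, the same payload also works through the official `openai` Python SDK by pointing `base_url` at the ModelBox endpoint.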