Mistral

Devstral Small 1.1

Specialized · Tool Use · Structured Output

About this model

Devstral Small 1.1 is a 24B parameter open-weight language model for software engineering agents, developed by Mistral AI in collaboration with All Hands AI. Finetuned from Mistral Small 3.1 and released under the Apache 2.0 license, it features a 128k token context window and supports both Mistral-style function calling and XML output formats. Designed for agentic coding workflows, Devstral Small 1.1 is optimized for tasks such as codebase exploration, multi-file edits, and integration into autonomous development agents like OpenHands and Cline. It achieves 53.6% on SWE-Bench Verified, surpassing all other open models on this benchmark, while remaining lightweight enough to run on a single 4090 GPU or Apple silicon machine. The model uses a Tekken tokenizer with a 131k vocabulary and is deployable via vLLM, Transformers, Ollama, LM Studio, and other OpenAI-compatible runtimes.
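Since the model supports Mistral-style function calling through OpenAI-compatible runtimes such as vLLM, a request for an agentic tool call can be assembled in the standard chat-completions schema. The sketch below only builds the request body; the tool name `list_files` is a hypothetical example for a coding agent, and the model identifier should be confirmed against your runtime's model listing.

```python
import json

# Hypothetical tool definition in the OpenAI-compatible function-calling
# schema that an agent framework would expose to the model.
list_files_tool = {
    "type": "function",
    "function": {
        "name": "list_files",  # hypothetical agent tool, not part of the model
        "description": "List files under a directory of the codebase.",
        "parameters": {
            "type": "object",
            "properties": {
                "path": {"type": "string", "description": "Directory to list."},
            },
            "required": ["path"],
        },
    },
}

# Request body an agent would POST to an OpenAI-compatible
# /v1/chat/completions endpoint (e.g., a local vLLM server).
request_body = {
    "model": "mistralai/Devstral-Small-2507",  # assumed HF id; verify for your runtime
    "messages": [
        {"role": "system", "content": "You are a software engineering agent."},
        {"role": "user", "content": "What files are in src/?"},
    ],
    "tools": [list_files_tool],
    "tool_choice": "auto",
}

print(json.dumps(request_body, indent=2))
```

The response would then carry a `tool_calls` entry naming `list_files` with JSON arguments, which the agent executes before sending the result back as a `tool` message.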

Performance Tier

Specialized

Devstral Small 1.1 is a specialized model from Mistral, built for a specific domain.

Domain-specific models are optimized for a particular task, such as code generation, image creation, or web search.

Pricing

This model is included in Elosia plans.

Rates per 1M tokens:

  • Input (prompt): $0.100
  • Output (completion): $0.300
  • Cache read: $0.010
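Using the rates from the pricing table above, the cost of a single request can be estimated from its token counts. This is a minimal sketch; it assumes cached prompt tokens are billed at the cache-read rate in place of the input rate, which should be confirmed against Elosia's billing documentation.

```python
# Rates from the pricing table, in dollars per 1M tokens.
INPUT_RATE = 0.100
OUTPUT_RATE = 0.300
CACHE_READ_RATE = 0.010

def request_cost(input_tokens: int, output_tokens: int,
                 cached_tokens: int = 0) -> float:
    """Estimate request cost in dollars.

    Assumes cached prompt tokens bill at the cache-read rate
    instead of the full input rate.
    """
    fresh_input = input_tokens - cached_tokens
    return (fresh_input * INPUT_RATE
            + output_tokens * OUTPUT_RATE
            + cached_tokens * CACHE_READ_RATE) / 1_000_000

# e.g., a 50k-token prompt (30k of it cached) with a 2k-token completion:
print(round(request_cost(50_000, 2_000, cached_tokens=30_000), 6))  # → 0.0029
```

Cache reads are 10x cheaper than fresh input here, so agent loops that resend a large, stable system prompt benefit substantially from prompt caching.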

Capabilities

Context Length: 131K
Max Output Tokens:
Tokenizer: Mistral
Input: text
Output: text
Release Date: July 10, 2025

Benchmarks

Programming
HumanEval: 84%
SWE-bench Verified: 40.2%

Recommended Use Cases

Coding

Strengths

  • Lightweight coding model for fast iteration
  • Open-weight and suitable for edge deployment
  • Good for inline code completion and small edits

Limitations

  • Limited reasoning for complex architectural decisions
  • Lower SWE-bench performance than larger alternatives

Resources

This model may use your data for training

Similar Models