Mistral: Devstral Small

mistralai/devstral-small

Devstral-Small-2505 is a 24B-parameter agentic LLM fine-tuned from Mistral-Small-3.1, jointly developed by Mistral AI and All Hands AI for advanced software engineering tasks. It is optimized for codebase exploration, multi-file editing, and integration into coding agents, achieving state-of-the-art results on SWE-Bench Verified (46.8%). Devstral supports a 128k context window and uses a custom Tekken tokenizer. It is text-only (the vision encoder has been removed) and is suitable for local deployment on high-end consumer hardware (e.g., an RTX 4090 or a Mac with 32 GB of RAM). Devstral is best used in agentic workflows via the OpenHands scaffold and is compatible with inference frameworks such as vLLM, Transformers, and Ollama. It is released under the Apache 2.0 license.
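
Because any OpenAI-compatible server can be queried the same way as the hosted example further below, here is a minimal sketch of talking to a locally deployed Devstral. It assumes a vLLM OpenAI-compatible server is already running the model on its default port (8000) and that the served model name is the Hugging Face repo id; both depend on how you launched the server, so adjust them to your setup.

import OpenAI from "openai";

// Hedged sketch: querying a locally served Devstral instance via vLLM's
// OpenAI-compatible endpoint. baseURL and model name are assumptions
// based on vLLM defaults and the Hugging Face repo id.
const localClient = new OpenAI({
    baseURL: "http://localhost:8000/v1",
    apiKey: "EMPTY", // vLLM does not require a real key by default
});

const reply = await localClient.chat.completions.create({
    model: "mistralai/Devstral-Small-2505",
    messages: [
        { role: "user", content: "List the files you would inspect to fix a failing unit test." },
    ],
});

console.log(reply.choices[0].message.content);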

Pricing

Price per input token: $0.00000006 ($0.06 per 1M input tokens)

Price per output token: $0.00000012 ($0.12 per 1M output tokens)
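
As a rough illustration of how these per-token prices translate into request cost, the sketch below computes the charge for a hypothetical request. The token counts are made up for the example; real usage numbers are returned in the `usage` field of the API response.

// Hedged sketch: estimating the cost of a single request at the prices above.
// The token counts are illustrative, not measured.
const INPUT_PRICE_PER_TOKEN = 0.00000006;  // $ per input token
const OUTPUT_PRICE_PER_TOKEN = 0.00000012; // $ per output token

const promptTokens = 10_000;    // hypothetical prompt size
const completionTokens = 2_000; // hypothetical completion size

const cost =
    promptTokens * INPUT_PRICE_PER_TOKEN +
    completionTokens * OUTPUT_PRICE_PER_TOKEN;

console.log(`Estimated cost: $${cost.toFixed(6)}`); // Estimated cost: $0.000840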

Usage Example

NOTE: BrainLink is compatible with the OpenAI API, so you can use the OpenAI SDK even with non-OpenAI models.

import OpenAI from "openai";

// Obtain the user's BrainLink access token (the BrainLink browser SDK is
// assumed to be loaded and initialized; see the BrainLink docs for setup).
const userAccessToken = await BrainLink.getUserToken();

// Point the OpenAI SDK at BrainLink's OpenAI-compatible endpoint.
const openai = new OpenAI({
    baseURL: "https://www.brainlink.dev/api/v1",
    apiKey: userAccessToken,
});

const completion = await openai.chat.completions.create({
    model: "mistralai/devstral-small",
    messages: [
        { role: "user", content: "Hi! How are you today?" },
    ],
});

console.log(completion.choices[0].message.content);
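
Since Devstral is tuned for software engineering tasks, a more representative request is a code-oriented one. The sketch below reuses the client from the example above; the prompt and the snippet it contains are purely illustrative.

// Hedged sketch: a code-oriented request, closer to Devstral's intended use.
// The prompt content is illustrative only.
const review = await openai.chat.completions.create({
    model: "mistralai/devstral-small",
    messages: [
        { role: "system", content: "You are a careful software engineering assistant." },
        {
            role: "user",
            content: "Explain what this function does and suggest a safer rewrite:\n\n" +
                "function parseAge(s) { return parseInt(s); }",
        },
    ],
});

console.log(review.choices[0].message.content);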