Mistral: Mixtral 8x7B (base)

mistralai/mixtral-8x7b

Mixtral 8x7B is a pretrained generative Sparse Mixture of Experts model from Mistral AI. Each layer incorporates 8 experts (feed-forward networks), two of which are active per token, for a total of 47B parameters (roughly 13B active per token). This is the base model, not fine-tuned for instructions; see [Mixtral 8x7B Instruct](/models/mistralai/mixtral-8x7b-instruct) for the instruct-tuned variant. #moe
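
To make the routing idea concrete, here is a hypothetical TypeScript sketch of top-2-of-8 expert routing. The expert count and top-2 selection match Mixtral's published design, but every name and the stand-in expert functions are invented for illustration; this is not Mistral AI's code:

// Illustrative sketch of sparse MoE routing (top 2 of 8 experts).
type Vector = number[];

const NUM_EXPERTS = 8;
const TOP_K = 2; // Mixtral routes each token to 2 of the 8 experts

// Each "expert" is a feed-forward network; here just a placeholder function.
const experts: ((x: Vector) => Vector)[] = Array.from(
    { length: NUM_EXPERTS },
    (_, i) => (x: Vector) => x.map((v) => v * (i + 1)) // stand-in for a real FFN
);

function softmax(logits: number[]): number[] {
    const max = Math.max(...logits);
    const exps = logits.map((l) => Math.exp(l - max));
    const sum = exps.reduce((a, b) => a + b, 0);
    return exps.map((e) => e / sum);
}

// Route one token: score all experts, keep the top 2, and mix their outputs
// weighted by the renormalized gate scores.
function moeLayer(token: Vector, gateLogits: number[]): Vector {
    const gates = softmax(gateLogits);
    const topExperts = gates
        .map((gate, index) => ({ gate, index }))
        .sort((a, b) => b.gate - a.gate)
        .slice(0, TOP_K);
    const norm = topExperts.reduce((sum, e) => sum + e.gate, 0);
    const out: Vector = token.map(() => 0);
    for (const { gate, index } of topExperts) {
        const y = experts[index](token);
        y.forEach((v, d) => (out[d] += (gate / norm) * v));
    }
    return out; // only 2 of the 8 expert FFNs ran for this token
}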

Pricing

Price per input token: $0.0000006 ($0.60 per million input tokens)

Price per output token: $0.0000006 ($0.60 per million output tokens)
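
Since both directions are billed at the same flat rate, estimating a request's cost is simple arithmetic. A minimal sketch; the function name and token counts are made up for illustration:

// Estimate request cost at $0.0000006 per input and output token.
const PRICE_PER_INPUT_TOKEN = 0.0000006;  // USD
const PRICE_PER_OUTPUT_TOKEN = 0.0000006; // USD

function estimateCostUSD(inputTokens: number, outputTokens: number): number {
    return inputTokens * PRICE_PER_INPUT_TOKEN + outputTokens * PRICE_PER_OUTPUT_TOKEN;
}

// Example: a 1,200-token prompt with an 800-token reply costs about $0.0012.
console.log(estimateCostUSD(1200, 800)); // ≈ 0.0012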

Usage Example

NOTE: BrainLink is compatible with the OpenAI API, so you can use the OpenAI SDK with non-OpenAI models such as this one.

import OpenAI from "openai";

// BrainLink issues a short-lived access token for the current user; the
// BrainLink SDK is assumed to be in scope here (see the BrainLink docs for setup).
const userAccessToken = await BrainLink.getUserToken();

// Point the OpenAI SDK at BrainLink's OpenAI-compatible endpoint.
const openai = new OpenAI({
    baseURL: "https://www.brainlink.dev/api/v1",
    apiKey: userAccessToken,
});

const completion = await openai.chat.completions.create({
    model: "mistralai/mixtral-8x7b",
    messages: [
      { role: "user", content: "Hi! How are you today?" }
    ],
});

// The reply text lives on the first choice's message.
console.log(completion.choices[0].message.content);
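
Because the endpoint speaks the OpenAI API, the SDK's standard streaming interface should work the same way in principle. Note that BrainLink's support for streaming is an assumption here, so treat this as a sketch rather than documented behavior:

// Sketch: streaming the same request (assumes BrainLink supports the OpenAI
// streaming protocol; `openai` is the client constructed above).
const stream = await openai.chat.completions.create({
    model: "mistralai/mixtral-8x7b",
    messages: [{ role: "user", content: "Hi! How are you today?" }],
    stream: true,
});
for await (const chunk of stream) {
    // Each chunk carries an incremental delta of the reply text.
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}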