Mistral: Mixtral 8x7B (base)
mistralai/mixtral-8x7b
About Mistral: Mixtral 8x7B (base)
Mixtral 8x7B is a pretrained generative Sparse Mixture-of-Experts model by Mistral AI. It incorporates 8 experts (feed-forward networks) per layer for a total of 47B parameters, with only a subset of experts active for each token (see the routing sketch below). This is the base model, not fine-tuned for instruction following; see Mixtral 8x7B Instruct for an instruct-tuned variant.
#moe
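
The routing idea behind the #moe tag can be illustrated in a few lines. Below is a minimal PyTorch sketch of Mixtral-style top-2 gating over 8 expert feed-forward networks; the toy dimensions, the SiLU activation, and the class name are illustrative assumptions, not Mixtral's actual implementation.

```python
# Minimal sketch of sparse top-2 Mixture-of-Experts routing.
# Toy dimensions; Mixtral's real FFNs use a gated (SwiGLU-style) design.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    def __init__(self, d_model: int, d_ff: int, n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, n_experts, bias=False)  # router
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model). Each token is routed to its top-k experts.
        logits = self.gate(x)                           # (tokens, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)  # pick top-k experts per token
        weights = F.softmax(weights, dim=-1)            # renormalize over chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                   # tokens sent to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k:k+1] * expert(x[mask])
        return out

moe = SparseMoE(d_model=64, d_ff=256)
tokens = torch.randn(10, 64)
print(moe(tokens).shape)  # torch.Size([10, 64])
```

Because each token passes through only 2 of the 8 experts, per-token compute is far below what the full 47B parameter count suggests.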
Specifications
Context Length: 32,768 tokens
Tokenizer: Mistral
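
Given the 32,768-token context and the model slug above, a request can be sketched against an OpenAI-compatible completions endpoint. The OpenRouter URL and the OPENROUTER_API_KEY environment variable below are assumptions based on the slug format; any host serving this model ID would work the same way. Since this is a base model, the prompt is a text continuation rather than an instruction.

```python
# Hedged sketch: text-completion request to an OpenAI-compatible router.
import os
import requests

resp = requests.post(
    "https://openrouter.ai/api/v1/completions",  # assumed host for this slug
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "mistralai/mixtral-8x7b",
        "prompt": "The theory of relativity states that",
        "max_tokens": 64,  # prompt + completion must fit in the 32,768-token context
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["text"])
```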
Pricing
Prompt: $0.600 / 1M tokens
Completion: $0.600 / 1M tokens
Image: $0 / image
Request: $0 / request
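
Assuming the prompt and completion prices above are quoted in USD per 1M tokens (the usual convention for listings like this), per-request cost is simple arithmetic:

```python
# Back-of-the-envelope cost check under the per-1M-token assumption.
PROMPT_PRICE = 0.600      # $ per 1M prompt tokens
COMPLETION_PRICE = 0.600  # $ per 1M completion tokens

def request_cost(prompt_tokens: int, completion_tokens: int) -> float:
    return (prompt_tokens * PROMPT_PRICE + completion_tokens * COMPLETION_PRICE) / 1_000_000

# e.g. a near-full-context request: 30,000 prompt tokens + 2,000 generated tokens
print(f"${request_cost(30_000, 2_000):.4f}")  # $0.0192
```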
Last updated: 4/10/2025