Mistral

Nemo

Mistral · Compact
Tool Use · Structured Output

About this model

A 12B parameter model with a 128k token context length built by Mistral in collaboration with NVIDIA. The model is multilingual, supporting English, French, German, Spanish, Italian, Portuguese, Chinese, Japanese,...

Performance Tier

Compact

Nemo is a compact model from Mistral, optimized for speed and affordability: small, fast, and low-cost, making it a good fit for high-volume or simple tasks.

Pricing

This model is included in Elosia plans.

Price (per 1M tokens):
  Input (prompt): $0.020
  Output (completion): $0.040
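At these rates, per-request cost is straightforward arithmetic: token count divided by one million, times the per-million rate. A minimal sketch using the rates from the table above (the token counts in the example are illustrative only):

```python
# Per-1M-token rates from the pricing table above.
INPUT_RATE = 0.020   # USD per 1M input (prompt) tokens
OUTPUT_RATE = 0.040  # USD per 1M output (completion) tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of a single request at the listed rates."""
    return (input_tokens / 1_000_000) * INPUT_RATE \
         + (output_tokens / 1_000_000) * OUTPUT_RATE

# Example: a 10,000-token prompt producing a 2,000-token completion.
print(f"${request_cost(10_000, 2_000):.6f}")  # prints "$0.000280"
```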

Capabilities

Context Length: 131K
Max Output Tokens: 16K
Tokenizer: Mistral
Input: text
Output: text
Release Date: July 19, 2024
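The context window (131K tokens) must hold both the prompt and the completion (capped at 16K tokens), so clients typically budget prompt length accordingly. A minimal sketch, assuming the limits listed above; the ~4-characters-per-token heuristic is a rough assumption, not Mistral's actual tokenizer:

```python
CONTEXT_LENGTH = 131_000    # total tokens the model can attend to
MAX_OUTPUT_TOKENS = 16_000  # cap on completion length

def max_prompt_tokens(reserved_output: int = MAX_OUTPUT_TOKENS) -> int:
    """Tokens left for the prompt after reserving room for the completion."""
    return CONTEXT_LENGTH - reserved_output

def fits_in_context(prompt: str, reserved_output: int = MAX_OUTPUT_TOKENS) -> bool:
    """Rough fit check using a ~4-chars-per-token estimate (not the real tokenizer)."""
    estimated_tokens = len(prompt) // 4 + 1
    return estimated_tokens <= max_prompt_tokens(reserved_output)

print(max_prompt_tokens())  # prints 115000
```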

Benchmarks

General Intelligence (MMLU): 68%
Mathematics (MATH-500): 55%
Programming (HumanEval): 68.5%

Recommended Use Cases

General Chat · Summarization · Translation

Strengths

  • Compact 12B model suitable for edge deployment
  • Good multilingual capabilities
  • Open-weight with Apache 2.0 license
  • EU-based company — data sovereignty advantage

Limitations

  • Lower performance than larger models, a trade-off of its compact 12B size
  • Outdated compared to newer Mistral models

Resources

This model may use your data for training.
