
Ministral 8B

Mistral AI · Proprietary · Verified

A state-of-the-art 8B-parameter model from Mistral AI's Ministral family ('les Ministraux'). It uses an interleaved sliding-window attention pattern for faster, more memory-efficient inference, and is designed for edge computing and on-device applications.
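The idea behind interleaved sliding-window attention can be sketched with attention masks: some layers keep full causal attention while others restrict each token to a recent window, which bounds the attention cost per layer. The exact interleaving pattern and window size Ministral 8B uses are not stated here, so the alternating layout below is only an illustration of the mechanism.

```python
import numpy as np

def causal_mask(seq_len: int) -> np.ndarray:
    # Standard causal mask: token i may attend to tokens 0..i.
    return np.tril(np.ones((seq_len, seq_len), dtype=bool))

def sliding_window_mask(seq_len: int, window: int) -> np.ndarray:
    # Causal attention restricted to the last `window` tokens:
    # token i attends to tokens max(0, i - window + 1)..i.
    i = np.arange(seq_len)[:, None]
    j = np.arange(seq_len)[None, :]
    return (j <= i) & (j > i - window)

def interleaved_masks(seq_len: int, window: int, n_layers: int):
    # Hypothetical interleaving: alternate full-causal and
    # sliding-window layers. Ministral's actual layer pattern
    # may differ; this only shows how the masks compose.
    return [
        causal_mask(seq_len) if layer % 2 == 0
        else sliding_window_mask(seq_len, window)
        for layer in range(n_layers)
    ]
```

Because a windowed layer only attends over `window` keys, its attention memory grows linearly with sequence length instead of quadratically, while the full-causal layers preserve long-range information flow.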

Released 2024-10-16 · 8B parameters · Decoder-only Transformer with Interleaved Sliding-Window Attention · Mistral Commercial License

Specifications

Parameters
8B
Architecture
Decoder-only Transformer with Interleaved Sliding-Window Attention
License
Mistral Commercial License
Context Window
128,000 tokens
Max Output
32,000 tokens
Type
text
Modalities
text
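The context window and max-output figures above imply a simple token budget. Assuming the common convention that the 128,000-token window is shared between prompt and completion (check Mistral's documentation for the model's exact accounting), the largest possible completion can be computed like this:

```python
CONTEXT_WINDOW = 128_000  # tokens the model can attend over (prompt + output)
MAX_OUTPUT = 32_000       # upper bound on generated tokens per request

def max_completion_tokens(prompt_tokens: int) -> int:
    # Largest completion that fits for an already-tokenized prompt,
    # assuming prompt and completion share the context window.
    remaining = CONTEXT_WINDOW - prompt_tokens
    return max(0, min(remaining, MAX_OUTPUT))
```

For example, a 100,000-token prompt leaves room for at most 28,000 generated tokens, while a short prompt is capped by the 32,000-token output limit.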

Benchmark Scores

The FACTS Grounding Leaderboard evaluates LLMs' ability to generate factually accurate long-form responses.

Advanced Specifications

Model Family
Ministral
API Access
Available
Chat Interface
Not Available
Hardware Support
edge-devices, mobile

Capabilities & Limitations

Capabilities
function-calling, reasoning, edge-computing, low-latency-inference, memory-efficient-inference
Notable Use Cases
on-device translation, internet-less smart assistants, local analytics, autonomous robotics, agentic workflows, input parsing, task routing, multi-step workflows
Function Calling Support
Yes
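Since the card lists function calling as supported and API access as available, a request would typically carry an OpenAI-style `tools` array, which Mistral's chat API also uses. The model identifier `ministral-8b-latest` and the `get_weather` tool below are illustrative assumptions, not part of this card:

```python
import json

def build_request(user_message: str) -> dict:
    # Sketch of a function-calling request body; the model name and
    # tool schema are hypothetical examples.
    return {
        "model": "ministral-8b-latest",  # assumed API identifier
        "messages": [{"role": "user", "content": user_message}],
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "get_weather",
                    "description": "Look up current weather for a city.",
                    "parameters": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    },
                },
            }
        ],
        "tool_choice": "auto",
    }

body = build_request("What's the weather in Paris?")
print(json.dumps(body, indent=2))
```

With `tool_choice` set to `"auto"`, the model decides whether to answer directly or to return a tool call whose arguments match the declared JSON schema.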
