
Mixtral 8x7B

Mistral AI · open-source

Efficient sparse mixture-of-experts (MoE) model: 46.7B total parameters, 12.9B active per token.
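The gap between total and active parameters comes from the MoE design: a router picks a small subset of expert feed-forward networks per token, so only those experts' weights run in the forward pass. A minimal sketch of top-2 routing with toy dimensions (illustrative only, not the real model's layer sizes or router):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 16, 8, 2  # toy sizes; Mixtral routes 2 of 8 experts

# Toy experts: one weight matrix each (real experts are multi-layer MLPs).
experts = [rng.standard_normal((d_model, d_model)) * 0.02 for _ in range(n_experts)]
router_w = rng.standard_normal((d_model, n_experts)) * 0.02

def moe_layer(x):
    # x: (d_model,) embedding of one token.
    logits = x @ router_w                 # router scores every expert...
    top = np.argsort(logits)[-top_k:]     # ...but only the top-2 are selected
    weights = softmax(logits[top])        # renormalize gates over the chosen 2
    # Only the selected experts' weights are touched: the "active" parameters.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

out = moe_layer(rng.standard_normal(d_model))
print(out.shape)  # (16,)
```

Because each token activates 2 of 8 experts (plus the shared attention and embedding weights), per-token compute tracks the 12.9B active figure rather than the 46.7B total.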

0.0 / 5.0 · 0 reviews


No reviews yet. Be the first to review Mixtral 8x7B!

Model Info

Provider: Mistral AI
Category: open-source
Total Reviews: 0
Avg. Rating: 0.0 / 5.0

Rating Guidelines

★★★★★ Exceptional
★★★★ Great
★★★ Good
★★ Fair
★ Poor