Just now, Mistral AI dropped its latest magnet link: a 281GB file containing a new 8x22B MoE model

Author: New Zhiyuan

Editor: Editorial Department

The startup Mistral AI has once again released a magnet link, this time a 281GB file containing its latest 8x22B MoE model.

With a single magnet link, Mistral AI is once again quietly stirring things up.

The 281.24GB file turns out to hold a brand-new 8x22B MoE model!

The new MoE model has a total of 56 layers, 48 attention heads, 8 experts, and 2 active experts.

Moreover, the context length is 65k.
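
To make the "8 experts, 2 active" design concrete, here is a minimal sketch of the top-2 expert routing used in Mixtral-style MoE layers. The function and variable names are illustrative assumptions, not Mistral AI's actual implementation.

```python
import torch
import torch.nn.functional as F

def moe_layer(x, gate, experts, top_k=2):
    """Route each token to its top_k experts and mix their outputs.

    x:       (num_tokens, hidden_dim) token activations
    gate:    nn.Linear mapping hidden_dim -> num_experts (8 here)
    experts: list of 8 feed-forward networks
    """
    logits = gate(x)                                  # (num_tokens, 8)
    weights, idx = torch.topk(logits, top_k, dim=-1)  # keep the 2 best experts per token
    weights = F.softmax(weights, dim=-1)              # renormalize over the chosen 2

    out = torch.zeros_like(x)
    for k in range(top_k):
        for e, expert in enumerate(experts):
            mask = idx[:, k] == e                     # tokens whose k-th choice is expert e
            if mask.any():
                out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
    return out
```

Only 2 of the 8 experts run for each token, which is why a model with 8x22B total expert parameters stays far cheaper per token than a dense model of the same overall size.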

Netizens noted that Mistral AI has, as always, set off a frenzy in the AI community with nothing but a magnet link.

Jia Yangqing also said he can't wait to see detailed comparisons between it and other SOTA models.

Sweeping the AI community with a single magnet link

Last December, Mistral AI released its first magnet link, and the 8x7B MoE model it contained drew widespread praise.

In benchmark tests, the combination of eight 7-billion-parameter models outperformed Llama 2, which has as many as 70 billion parameters.

It handles 32k contexts well, supports English, French, Italian, German, and Spanish, and shows strong performance in code generation.

In February this year, Mistral AI launched its latest flagship model, Mistral Large, with performance directly comparable to GPT-4.

However, this version of the model is not open source.

Mistral Large has excellent logical reasoning capabilities and is capable of handling complex multilingual tasks including text understanding, conversion, and code generation.

Then, just half a month ago, at a Cerebral Valley hackathon, Mistral AI open-sourced the Mistral 7B v0.2 base model.

This model supports a 32k context, uses no sliding window, and sets rope_theta = 1e6.
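
Here, rope_theta is the base of the rotary position embedding (RoPE) frequencies; raising it from the common 1e4 to 1e6 slows the rotations, which is one standard way of adapting a model to longer contexts. Below is a minimal sketch of how the per-dimension frequencies follow from this value, using the standard RoPE formula rather than Mistral AI's source code.

```python
import torch

def rope_inv_freq(head_dim: int, theta: float = 1e6) -> torch.Tensor:
    # Standard RoPE: one rotation frequency per pair of head dimensions.
    # A larger theta stretches the rotation periods, so positions far
    # apart still map to distinguishable angles at long context lengths.
    return 1.0 / (theta ** (torch.arange(0, head_dim, 2).float() / head_dim))
```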

Now the latest 8x22B MoE model is also available on the Hugging Face platform, where community members can build their own applications on top of it.
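
For reference, loading such a checkpoint from Hugging Face with the transformers library typically looks like the sketch below. The repository name is a placeholder, not a confirmed upload, and weights of this size (roughly 281GB) require several GPUs or aggressive quantization to run.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical repo id -- substitute the actual upload of the 8x22B weights.
model_id = "mistral-community/Mixtral-8x22B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # shard the weights across all available GPUs
    torch_dtype="auto",  # keep the checkpoint's native precision
)

prompt = "Mistral AI's new 8x22B MoE model is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=32)[0]))
```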
