Mistral, widely seen as Europe's great hope for AI, has released several updates to its AI assistant, Le Chat. In addition to major web interface upgrades, the company is releasing mobile apps on iOS and Android.
As a reminder, Mistral develops its own large language models. The company's flagship models, such as Mistral Large and the multimodal model Pixtral Large, are available for commercial use through APIs or cloud partners such as Azure AI Studio, Amazon Bedrock, and Google's Vertex AI. It also releases many open-weight models under the Apache 2.0 license.
Mistral wants to position itself as a reliable alternative to OpenAI or Anthropic. In addition to developing foundation models, its Le Chat assistant competes directly with ChatGPT, Claude, Google Gemini, and Microsoft Copilot.
Mistral has finally released a mobile version of the assistant to better compete for a coveted spot on your phone's home screen. The mobile app has a standard chatbot interface: you can query Mistral's AI models and ask follow-up questions in a simple, conversation-like interface.
Over the past few months, Le Chat has evolved into a capable AI assistant. In November 2024, Mistral added support for web search with citations. You can also generate images and interact with a freeform canvas to edit text or code in a separate window.
Recently, the company signed a content deal with news agency Agence France-Presse (AFP) to ground its results in reliable reporting. However, for those who rely on speech to query AI assistants, Mistral's mobile app does not have a voice mode.
![](https://techcrunch.com/wp-content/uploads/2025/02/Screenshot-2025-02-06-at-4.01.02PM.png?w=680)
With today's update to Le Chat, Mistral is also introducing a Pro plan at $14.99 per month, or €14.99 per month in Europe. The company does not detail the exact AI models it uses under the hood, but it says the Pro plan grants access to its "best performance models."
Other benefits include increased usage limits and the ability to opt out of data sharing with Mistral. With this, the company is bringing a paid tier to its AI assistant.
Up to 1,000 words per second
What makes Mistral stand out? The company does not claim its models are better than the competition's. But it argues that, from a product standpoint, it can sometimes beat its competitors.
Mistral, for example, says Le Chat runs on "the fastest inference engines on the planet" and can answer at up to 1,000 words per second. In our use, it felt faster than using ChatGPT with the GPT-4o model.
Mistral also claims Le Chat produces much better images than ChatGPT and Grok. The reason it performs well in this respect is that it relies on Flux Ultra from Black Forest Labs, one of the leading image generation models.
These are nice features, but one of the main highlights lies in Le Chat's enterprise offering. Mistral lets companies deploy Le Chat in their own environment, with custom models and custom user interfaces.
If you work in defense or banking, you may need the ability to deploy an AI assistant on premises. That is currently not possible with ChatGPT Enterprise or Claude Enterprise, and it could help Mistral grow its revenue.
But first, let's see if Mistral's mobile app can convince some AI enthusiasts to try out its chatbot. At the time of writing, ChatGPT, DeepSeek, and Google Gemini hold the No. 2, No. 3, and No. 6 spots, respectively, among the most downloaded iPhone apps in the U.S.