Meta AI, Meta’s AI-powered assistant across Facebook, Instagram, Messenger and the web, can now speak in more languages and create stylized selfies. And, starting today, Meta AI users can route questions to Meta’s newest flagship AI model, Llama 3.1 405B, which the company says can handle more complex queries than the previous model underpinning Meta AI.
The question is whether the enhancements will be enough to improve the overall Meta AI experience, which many reviewers, including TechCrunch’s Devin Coldewey, found incredibly underwhelming at launch. The early iterations of Meta AI struggled with facts, numbers and web search, often failing to complete basic tasks like looking up recipes and airfares.
Llama 3.1 405B could make a difference. Meta claims that the new model is particularly adept at math and coding questions, making it well suited for helping with math homework, explaining scientific concepts, debugging code and so on.
However, there’s a catch. Meta AI users have to manually switch to Llama 3.1 405B to use it, and they’re limited to a certain number of queries each week before Meta AI automatically switches them over to a less capable model (Llama 3.1 70B).
Meta’s labeling the Llama 3.1 405B integration as a “preview” for the time being.
Generative selfies
Llama 3.1 405B isn’t the only new generative AI model in Meta AI. Another one powers the assistant’s new selfie feature.
The model, called Imagine Yourself, creates images based on a photo of a person and a prompt like “Imagine me surfing” or “Imagine me on a beach vacation.” Available in beta, Imagine Yourself can be invoked in Meta AI by typing “Imagine me” followed by just about any prompt that isn’t NSFW.