Meta is shaking up the AI scene with its latest marvel, the Llama 3.3 70B model.
This new addition to the Llama family promises to deliver the same punch as its larger sibling, the Llama 3.1 405B, but at a fraction of the cost. Meta's VP of generative AI, Ahmad Al-Dahle, credits cutting-edge post-training techniques for making the model both a powerhouse and a penny-saver.
In a head-to-head showdown, Llama 3.3 70B outstrips competitors like Google’s Gemini 1.5 Pro and OpenAI’s GPT-4o on several benchmarks, including MMLU, which tests language understanding. Meta says the model brings improvements in areas such as math and general knowledge, making it a versatile tool for developers.
Available for download on platforms like Hugging Face, Llama 3.3 70B is part of Meta’s strategy to lead the AI world with models that are open for a range of applications. However, there are some strings attached: developers whose platforms serve very large audiences need a special license from Meta to use Llama models.
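For developers who want to try the model, here is a minimal sketch of one way to load it through Hugging Face's transformers library. The repository name meta-llama/Llama-3.3-70B-Instruct and the generation settings are assumptions for illustration rather than details from Meta's announcement, and running the full 70B weights requires accepting the license on the model page plus substantial GPU memory (or quantisation).

# Minimal sketch: loading Llama 3.3 70B via Hugging Face transformers.
# Assumes the repo ID below, an accepted license, and enough GPU memory.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-3.3-70B-Instruct"  # assumed repository name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # spread layers across available GPUs
    torch_dtype="auto",  # use the checkpoint's native precision
)

# Build a chat-style prompt with the tokenizer's built-in chat template.
messages = [{"role": "user", "content": "Summarise Llama 3.3 70B in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=100)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))

Smaller Llama variants can be swapped in via the same repo-ID pattern if local hardware is limited.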
Want to hear more? Join Mal on the Property AI Report Podcast each week!
Access from your preferred podcast provider by clicking here
Despite some regulatory hurdles, including compliance with the EU's AI Act and GDPR, Meta is pushing forward. The company is building a colossal $10 billion AI data centre in Louisiana to support future Llama models, which are expected to demand a whopping 10x more computing power than their predecessors.
With over 650 million downloads and Meta AI boasting nearly 600 million monthly users, Llama is making waves. As Meta scales up its infrastructure with a fleet of Nvidia GPUs, the future of Llama looks bright and bustling.
Made with TRUST_AI - see the Charter: https://www.modelprop.co.uk/trust-ai