watsonx.ai

Breaking News: Llama 3.1 is here! 🚀🦙

By Maryam Ashoori, posted 3 days ago

I’m thrilled to announce that Llama 3.1 405B is coming to watsonx.ai today! 🎉

This model update is packed with exciting features and improvements, including:

🤯 Pre-trained and instruction-tuned generative models in 8B, 70B, and 405B sizes
🌎 Multilingual support for German, French, Italian, Portuguese, Hindi, Spanish, and Thai
⏱️ Grouped query attention (GQA) to speed up inference
💬 Instruction-tuned variants optimized for multilingual dialogue
🔥 Trained using more than 16,000 NVIDIA H100 GPUs
🔧 Fine-tuned for agentic tool use (e.g., search, image generation, code execution, and mathematical reasoning)
📕 Text-only models that support most text-based workloads, such as analyzing PDFs

The 405B model is the largest and most powerful open-source model available on watsonx.ai, making it a great fit for high-accuracy inference, synthetic data generation, and serving as a teacher model when fine-tuning smaller models. 🚀
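If you want to try the model programmatically, here is a minimal sketch using the ibm-watsonx-ai Python SDK. The model ID, endpoint URL, API key, and project ID below are placeholders/assumptions, so verify the exact model identifier in your watsonx.ai model catalog:

# Minimal sketch: calling Llama 3.1 405B on watsonx.ai with the ibm-watsonx-ai SDK.
# The model_id and credential values are assumptions/placeholders -- confirm the
# exact identifier in your watsonx.ai model catalog before running.
from ibm_watsonx_ai import Credentials
from ibm_watsonx_ai.foundation_models import ModelInference

credentials = Credentials(
    url="https://us-south.ml.cloud.ibm.com",  # your watsonx.ai region endpoint
    api_key="YOUR_IBM_CLOUD_API_KEY",
)

model = ModelInference(
    model_id="meta-llama/llama-3-405b-instruct",  # assumed ID; verify in the catalog
    credentials=credentials,
    project_id="YOUR_PROJECT_ID",
    params={"decoding_method": "greedy", "max_new_tokens": 200},
)

# Send a simple prompt and print the generated text.
response = model.generate_text(
    prompt="Summarize the key features of Llama 3.1 in three bullet points."
)
print(response)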

Check out the official release announcements:
Meta: https://lnkd.in/eFd-zS8C
IBM: https://lnkd.in/eqjYV7Ku

Tutorials:
1️⃣ No-code RAG tutorial for Llama 3.1 405B: https://lnkd.in/ezYUhQUg
2️⃣ LangChain RAG application for PDF documents using Llama 3.1 405B: https://lnkd.in/eAS6QvkR
3️⃣ LangChain RAG system for web data in Python using Llama 3.1 405B: https://lnkd.in/eygtYEc4 (a minimal LangChain sketch follows below)
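As a starting point for the LangChain tutorials above, a minimal sketch of wrapping Llama 3.1 405B on watsonx.ai as a LangChain LLM (via the langchain-ibm package) could look like this; the model ID and credential values are placeholders, and the full RAG pipelines with embeddings and a vector store are in the linked tutorials:

# Minimal sketch: Llama 3.1 405B on watsonx.ai as a LangChain LLM.
# model_id and credentials are assumptions/placeholders.
from langchain_ibm import WatsonxLLM

llm = WatsonxLLM(
    model_id="meta-llama/llama-3-405b-instruct",  # assumed ID; verify in the catalog
    url="https://us-south.ml.cloud.ibm.com",
    apikey="YOUR_IBM_CLOUD_API_KEY",
    project_id="YOUR_PROJECT_ID",
    params={"decoding_method": "greedy", "max_new_tokens": 300},
)

# The tutorials combine this LLM with an embedding model, a vector store,
# and a retriever; here we simply ask it a question directly.
print(llm.invoke("What is retrieval-augmented generation?"))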

(Note: this post was written with the assistance of Llama 3.1 405B on watsonx.ai!)


#watsonx.ai
#GenerativeAI