Grok-1: On March 17, Elon Musk’s AI company xAI released its large language model (LLM) Grok-1 as open source. The billionaire had declared last week on his social media platform X, formerly known as Twitter, that the AI chatbot would be open-sourced, and the model is now accessible to developers and researchers. Notably, the xAI developers said that only the pre-trained LLM has been made available to the general public. This means the training data itself has not been released, but you can still build on the model using the published weights and network architecture.
What is Grok-1?
“We are releasing the base model weights and network architecture of Grok-1, our large language model,” the company announced in a blog post. “Grok-1 is a 314 billion parameter Mixture-of-Experts model trained from scratch by xAI.”
The model that is now accessible is “a raw base model checkpoint from the Grok-1 pre-training phase, which concluded in October 2023,” according to xAI. In other words, the model has not been fine-tuned for any particular use case.
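To give a rough sense of what a Mixture-of-Experts model is, here is a minimal, illustrative sketch in Python. This is not Grok-1’s actual architecture or dimensions; it is a toy layer showing the general idea that a router activates only a few “expert” sub-networks per token, so only a fraction of a model’s parameters is used for any given input.

```python
# Toy Mixture-of-Experts layer (illustrative only; not Grok-1's real design).
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyMoE(nn.Module):
    def __init__(self, d_model=64, n_experts=8, top_k=2):
        super().__init__()
        self.router = nn.Linear(d_model, n_experts)  # scores each expert per token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, 4 * d_model),
                          nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts)
        )
        self.top_k = top_k

    def forward(self, x):                                # x: (tokens, d_model)
        scores = self.router(x)                          # (tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)   # pick the top-k experts
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for k in range(self.top_k):                      # route tokens to chosen experts
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k:k + 1] * expert(x[mask])
        return out

tokens = torch.randn(10, 64)
print(ToyMoE()(tokens).shape)  # torch.Size([10, 64])
```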
Is it actually open source?
Rather than fully open-sourcing Grok, the company has chosen a more limited open-weight approach. Unlike a fully open-source release, which is completely transparent, an open-weight release gives developers a pre-trained model to work with. Open-source models offer greater insight and more customisation possibilities, but they also demand more work from developers.
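In practice, the open-weight release means developers download the checkpoint and build on it rather than training from scratch. A minimal sketch of fetching the weights is below; the Hugging Face repository id “xai-org/grok-1” and the file patterns are assumptions, so check xAI’s announcement for the official source (the weights were also distributed via torrent).

```python
# Sketch: download the released Grok-1 checkpoint to build on it locally.
# Assumes a Hugging Face mirror at "xai-org/grok-1"; verify against xAI's
# announcement before relying on this. The download is several hundred GB.
from huggingface_hub import snapshot_download

ckpt_dir = snapshot_download(
    repo_id="xai-org/grok-1",                        # assumed repository id
    allow_patterns=["ckpt-0/*", "*.py", "*.json"],   # assumed file layout
    local_dir="./grok-1",
)
print("Checkpoint downloaded to", ckpt_dir)
```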