Elon Musk launches Grok-1 AI to counter ChatGPT
Grok-1 is still in early development but it shows much promise
Elon Musk’s xAI has released Grok-1, its first AI model, built on a 314-billion-parameter Mixture-of-Experts architecture, as the company explained in a blog post.
For those who don’t know yet, Musk chose the name as a reference to The Hitchhiker’s Guide to the Galaxy.
Previously, Grok was only available to Premium X users, but now Musk has kept his promise to release an open-source version.
How can I get Grok-1?
Before anything else, don’t get too excited, because the new model is not yet tuned for optimal use.
This is the raw base model checkpoint from the Grok-1 pre-training phase, which concluded in October 2023. This means that the model is not fine-tuned for any specific application, such as dialogue.
X.ai blog
Anyway, if you want to get the code and play with it, you can find it on the Grok-1 GitHub page. It is released under the Apache 2.0 license, so you can modify it freely.
However, there is not much information about Grok-1 yet. The folks at xAI disclosed only a few details:
X.ai blog
- Base model trained on a large amount of text data, not fine-tuned for any particular task.
- 314B parameter Mixture-of-Experts model with 25% of the weights active on a given token.
- Trained from scratch by xAI using a custom training stack on top of JAX and Rust in October 2023.
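To picture what “25% of the weights active on a given token” means, here is a minimal, illustrative sketch of Mixture-of-Experts routing. This is not xAI’s actual implementation; the expert count (8), top-2 routing, and tiny random weights are hypothetical choices picked only because 2 of 8 experts matches the stated 25%:

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 8  # hypothetical: 2 of 8 experts matches the stated 25% active
TOP_K = 2        # experts consulted per token
D_MODEL = 16     # toy hidden size, nothing like the real model

# One tiny linear "expert" per slot; weights are random placeholders.
experts = [rng.normal(size=(D_MODEL, D_MODEL)) for _ in range(NUM_EXPERTS)]
router = rng.normal(size=(D_MODEL, NUM_EXPERTS))

def moe_layer(x):
    """Route one token vector to its top-2 experts and blend their outputs."""
    logits = x @ router
    top = np.argsort(logits)[-TOP_K:]            # indices of the 2 best experts
    gate = np.exp(logits[top])
    gate /= gate.sum()                           # renormalized softmax over the chosen 2
    out = sum(w * (x @ experts[i]) for w, i in zip(gate, top))
    return out, top

token = rng.normal(size=D_MODEL)
out, used = moe_layer(token)
print(f"experts used: {sorted(used)} -> {len(used) / NUM_EXPERTS:.0%} of experts active")
```

The point of the design is that only the selected experts’ weights participate in each token’s forward pass, so a 314B-parameter model does far less compute per token than a dense model of the same size.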
Interestingly enough, the cover image of the post, which we also displayed at the top, was created with Midjourney from a prompt proposed by Grok: “A 3D illustration of a neural network, with transparent nodes and glowing connections, showcasing the varying weights as different thicknesses and colors of the connecting lines.”
If you decide to try the new model, tell us all about it in the comments section below.