Elon Musk unveils Grok, xAI's advanced LLM, to rival OpenAI's GPT and Anthropic's Claude 2

Elon Musk has unveiled xAI's flagship AI product, "Grok," a Large Language Model (LLM) that offers real-time data access, efficiency, and "humor". The name "Grok" implies deep understanding, and the model competes directly with other LLMs such as OpenAI's GPT and Anthropic's Claude 2. For now, Grok is accessible to a select group of users in the United States, with a waitlist for early access through the X social network. Pricing details have yet to be announced.

Grok-0, the initial prototype, was built with 33 billion parameters, more than the 20 billion reportedly used by OpenAI's GPT-3.5 but fewer than newer models such as Meta's Llama 2, whose largest variant uses 70 billion. Despite this, xAI claims that Grok-0 matches Llama 2's capabilities with just half the training resources.
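To give a sense of scale for those parameter counts, here is a minimal back-of-envelope sketch (not from the article) of how much memory each model's raw weights alone would occupy at 16-bit precision; the GPT-3.5 figure used here is the disputed 20 billion discussed in the comments below:

```python
# Rough memory footprint of model weights, assuming 2 bytes per parameter
# (fp16/bf16). Counts are the figures cited in the article; GPT-3.5's is
# an unconfirmed estimate.
BYTES_PER_PARAM = 2  # fp32 would double this; int8/int4 quantization shrinks it

models = {
    "Grok-0": 33e9,
    "GPT-3.5 (reported)": 20e9,
    "Llama 2 (largest)": 70e9,
}

for name, params in models.items():
    gib = params * BYTES_PER_PARAM / 2**30
    print(f"{name:>20}: {params / 1e9:.0f}B params ~ {gib:.0f} GiB of weights")
```

At fp16, Grok-0's 33 billion parameters work out to roughly 61 GiB of weights alone, which hints at why training and serving models of this class requires multi-GPU hardware.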

The xAI team's advancements in reasoning and coding led to the creation of Grok-1, the more capable LLM that powers the Grok chatbot. Within its compute class, Grok-1 surpasses models such as GPT-3.5 and Inflection-1 on machine learning benchmarks and tasks. Future plans include making Grok available to all X Premium+ subscribers, although no timeline has been disclosed.

by Mauricio B. Holguin

Grok is an LLM system developed by xAI, with inspiration drawn from "The Hitchhiker's Guide to the Galaxy." As an advanced AI, Grok is designed to simulate human-like conversation and interaction. Top alternatives to Grok include ChatGPT, HuggingChat, and Google Bard, each offering its own functionality in the realm of AI chat systems.

Comments

samitr
0

GPT-3.5 uses 100 million parameters, and GPT-4 uses 1 trillion.

3 replies
samitr

100 billion for GPT-3.5**, and more precisely 175 billion

Mauricio B. Holguin

To be fair, the thing with GPT-3.5 is that OpenAI hasn't provided an official figure, as far as I know. And a recent Microsoft codediffusion paper suggests it actually has 20 billion parameters, though this is still being debated:

https://www.reddit.com/r/LocalLLaMA/comments/17jrj82/new_microsoft_codediffusion_paper_suggests_gpt35/

https://www.reddit.com/r/LocalLLaMA/comments/17lvquz/clearing_up_confusion_gpt_35turbo_may_not_be_20b/

samitr

Thank you very much, it’s good to know!

dylwintftw
0

How is 33 billion less than 20 billion?

BarnMTB
0

Love that he poked fun at the "Sorry, I’m a language model and can’t answer that" responses that current AIs love to recite.

It's pretty annoying to hit a brick wall, even with pretty safe questions.
