The team said the new model is “an order of magnitude more capable” than Grok 2, indicating Grok 3 has 10 to 15 times more ...
and added that it has been only 17 months since the launch of the Grok-1 model. The team tested Grok 3 against many scholastic challenges, including the American Invitational Mathematics ...
It was originally built atop what has been called the Grok-1 model. Grok-1 was developed over the course of months on a cluster of “tens of thousands” of GPUs, and the chatbot leverages Flux.
Elon Musk, in a livestreamed presentation with xAI engineers, claimed that Grok 3 has “more than 10 times” the compute power ...
OpenAI researchers accused xAI of publishing misleading Grok 3 benchmarks. The truth is a little more nuanced.
On March 11, 2024, Musk posted on X that the language model would go open source within a week; six days later, on March 17, Grok-1 was open-sourced under the Apache 2.0 license. Elon Musk ...