@desaiankitb

64w ago

The base model weights and network architecture of Grok-1, xAI's 314-billion-parameter Mixture-of-Experts model, have been released.
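To make the "Mixture-of-Experts" part concrete: instead of one dense feed-forward block, an MoE layer keeps several expert blocks and a small router that sends each token to only a few of them, so most of the 314B parameters sit idle on any given token. This is a minimal illustrative sketch with toy sizes and random weights, not Grok-1's actual code; the names (`router`, `moe_forward`, `top_k`) are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model, n_experts, top_k = 8, 4, 2  # toy sizes; Grok-1 is vastly larger

# One weight matrix per expert, plus a router that scores experts per token.
experts = [rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(n_experts)]
router = rng.standard_normal((d_model, n_experts)) * 0.1

def moe_forward(x):
    """Route one token vector through its top-k experts only."""
    logits = x @ router                      # router score for each expert
    top = np.argsort(logits)[-top_k:]        # indices of the k best-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                 # softmax over the selected experts
    # Only the chosen experts run; the remaining parameters are untouched.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(d_model)
out = moe_forward(token)
```

The key property is sparsity: compute per token scales with `top_k`, while total capacity scales with `n_experts`.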

Posted in AIML explained by Ankit on BeGenuin
  • Large Language Models

Comments (1)