GPT-4: 8 Models in One; The Secret is Out

<p>GPT-4 has been THE groundbreaking model so far, available to the general public either for free or through OpenAI&rsquo;s commercial portal (in public beta). It has worked wonders in igniting new project ideas and use cases for many entrepreneurs, but the secrecy around the model and its parameter count was killing every enthusiast, with bets ranging from the first 1-trillion-parameter model all the way up to 100-trillion-parameter claims!</p>
<h1>The cat is out of the bag</h1>
<p>Well, the cat is out of the bag (sort of). On June 20th,&nbsp;<a href="https://twitter.com/swyx/status/1671272883379908608" rel="noopener ugc nofollow" target="_blank">George Hotz</a>, founder of the self-driving startup Comma.ai, leaked that GPT-4 isn&rsquo;t a single monolithic dense model (like GPT-3 and GPT-3.5) but a mixture of eight 220-billion-parameter models.</p>
<p><iframe frameborder="0" height="1386" scrolling="no" src="https://cdn.embedly.com/widgets/media.html?type=text%2Fhtml&amp;key=a19fcc184b9711e1b4764040d3dc5c07&amp;schema=twitter&amp;url=https%3A//twitter.com/swyx/status/1671272883379908608&amp;image=https%3A//i.embed.ly/1/image%3Furl%3Dhttps%253A%252F%252Fabs.twimg.com%252Ferrors%252Flogo46x38.png%26key%3Da19fcc184b9711e1b4764040d3dc5c07" title="swyx.ai on Twitter: &quot;GPT4 is 8 x 220B params = 1.7 Trillion params https://t.co/DW4jrzFEn2ok I wasn't sure how widely to spread the rumors on GPT-4 but it seems Soumith is also confirming the same so here's the quick clip! so yes, GPT4 is technically 10x the size of GPT3, and all the small... pic.twitter.com/m2YiaHGVs4 / Twitter&quot;" width="680"></iframe></p>
<p>Later that day,&nbsp;<a href="https://twitter.com/soumithchintala/status/1671267150101721090" rel="noopener ugc nofollow" target="_blank">Soumith Chintala</a>, co-founder of PyTorch at Meta, reaffirmed the leak.</p>
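<p>For readers unfamiliar with the idea, a &ldquo;mixture&rdquo; of models like the one rumored here is usually a mixture-of-experts (MoE) setup: a small gating function scores the experts for each input, only the top few experts actually run, and their outputs are combined by the gate&rsquo;s weights. The sketch below is purely illustrative and assumes nothing about GPT-4&rsquo;s real router or experts &mdash; the toy <code>gate</code> and <code>make_expert</code> functions are hypothetical stand-ins, with only the expert count (8) and top-k routing pattern taken from the rumor.</p>
<pre><code>import math

NUM_EXPERTS = 8  # the rumored expert count; everything else here is a toy

def softmax(xs):
    # Standard numerically-stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def make_expert(i):
    # Stand-in for a 220B-parameter model: here, just scale the input.
    def expert(x):
        return [v * (i + 1) for v in x]
    return expert

experts = [make_expert(i) for i in range(NUM_EXPERTS)]

def gate(x):
    # Toy router: derive a score per expert from the input.
    # In a real MoE the router is a small learned network.
    scores = [(sum(x) * (i + 1)) % 7 for i in range(NUM_EXPERTS)]
    return softmax(scores)

def moe_forward(x, top_k=2):
    # Route: keep only the top-k experts, renormalize their weights,
    # run those experts, and blend their outputs.
    weights = gate(x)
    top = sorted(range(NUM_EXPERTS), key=lambda i: weights[i], reverse=True)[:top_k]
    total = sum(weights[i] for i in top)
    out = [0.0] * len(x)
    for i in top:
        w = weights[i] / total
        out = [o + w * v for o, v in zip(out, experts[i](x))]
    return out
</code></pre>
<p>The appeal of this design is that only <code>top_k</code> of the eight experts run per input, so inference cost scales with the active experts rather than the full 1.7-trillion-parameter total.</p>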
Tags: ChatGPT Secret