LLM Economics: ChatGPT vs Open-Source
<p>TLDR: For lower usage, in the thousands of requests per day, ChatGPT works out cheaper than open-source LLMs deployed on AWS. For millions of requests per day, open-source models deployed on AWS work out cheaper. (As of writing this article on April 24th, 2023.)</p>
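<p>To make the break-even intuition concrete, here is a rough back-of-the-envelope sketch (not from the article). It assumes roughly 1,000 tokens per request, gpt-3.5-turbo pricing of $0.002 per 1K tokens (OpenAI's published rate in April 2023), a g4dn.xlarge-class GPU instance at about $0.526/hour, and a hypothetical capacity of 250,000 requests per instance per day; the article's own numbers and model choices may differ.</p>
<pre><code>
import math

# Assumptions (illustrative only, not from the article):
TOKENS_PER_REQUEST = 1_000            # prompt + completion tokens, assumed
CHATGPT_PRICE_PER_1K_TOKENS = 0.002   # gpt-3.5-turbo, April 2023 pricing
AWS_INSTANCE_HOURLY = 0.526           # g4dn.xlarge on-demand, assumed rate
REQUESTS_PER_INSTANCE_PER_DAY = 250_000  # hypothetical serving capacity
HOURS_PER_DAY = 24

def chatgpt_daily_cost(requests_per_day: int) -> float:
    """API cost scales linearly with the number of requests."""
    return requests_per_day * (TOKENS_PER_REQUEST / 1_000) * CHATGPT_PRICE_PER_1K_TOKENS

def self_hosted_daily_cost(requests_per_day: int) -> float:
    """Instance cost is a step function: you pay per instance-day
    no matter how few requests each instance actually serves."""
    instances = max(1, math.ceil(requests_per_day / REQUESTS_PER_INSTANCE_PER_DAY))
    return instances * AWS_INSTANCE_HOURLY * HOURS_PER_DAY

for volume in (1_000, 100_000, 1_000_000, 10_000_000):
    print(f"{volume:>10,} req/day  ChatGPT: ${chatgpt_daily_cost(volume):>9,.2f}"
          f"  Self-hosted: ${self_hosted_daily_cost(volume):>9,.2f}")
</code></pre>
<p>Under these assumptions, 1,000 requests per day costs about $2 on the API versus roughly $12.60 for an always-on GPU instance, while 1 million requests per day costs about $2,000 on the API versus around $50 self-hosted, which is the crossover the TLDR describes.</p>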
<p>Large Language Models are taking the world by storm. Transformers were introduced in 2017, followed by breakthrough models like BERT, GPT, and BART with hundreds of millions of parameters, capable of performing multiple language tasks such as sentiment analysis, Q&A, and classification.</p>
<p>A couple of years ago, researchers from OpenAI and Google published multiple papers showing that large language models with more than tens of billions of parameters start to show emergent capabilities: they seemingly understand complex aspects of language and are almost human-like in their responses.</p>
<p><a href="https://towardsdatascience.com/llm-economics-chatgpt-vs-open-source-dfc29f69fec1"><strong>Read the full article on Towards Data Science</strong></a></p>