You don’t need hosted LLMs, do you?
<p>Amid the LLM hype, you can find plenty of articles such as “<em>Fine-tune your Private LLaMA/Falcon/Another Popular LLM</em>”, “<em>Train Your Own Private ChatGPT</em>”, “<em>How to Create a Local LLM</em>”, and so on.</p>
<p>At the same time, few people explain why you would need one in the first place. I mean, <strong>are you really sure you need your own self-hosted LLM? </strong>Perhaps the OpenAI API would be a better choice for you.</p>
<p><img alt="" src="https://miro.medium.com/v2/resize:fit:1000/1*-z4jrLqZF4MBaZaditRehg.png" style="height:407px; width:1000px" /></p>
<p>“Short” summary of the article</p>
<p>In this article, I will compare two approaches to using LLMs: calling the OpenAI API versus deploying your own model. We will compare them on cost, text generation quality, development speed, and privacy.</p>
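<p>For illustration (not taken from the article itself), here is a minimal sketch of what the two approaches look like in code, assuming the pre-1.0 <code>openai</code> Python package (the interface current as of August 2023) and Hugging Face <code>transformers</code>; the API key, prompt, and model name are placeholders.</p>
<pre><code class="language-python">import openai
from transformers import pipeline

# Approach 1: hosted - a single call to the OpenAI API (pre-1.0 interface, August 2023)
openai.api_key = "YOUR_API_KEY"  # placeholder
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Summarize this article in one sentence."}],
)
print(response["choices"][0]["message"]["content"])

# Approach 2: self-hosted - run an open model locally; the model name is a placeholder
# for whatever open model you actually deploy (may require extra flags such as
# trust_remote_code or an accelerate install, depending on the model).
generator = pipeline("text-generation", model="tiiuae/falcon-7b-instruct", device_map="auto")
output = generator("Summarize this article in one sentence.", max_new_tokens=50)
print(output[0]["generated_text"])
</code></pre>
<p>The hosted call offloads all infrastructure to OpenAI, while the self-hosted path puts the model weights, hardware, and serving stack entirely in your hands, which is exactly the trade-off examined below.</p>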
<blockquote>
<p><strong>Disclaimer</strong>: The information in this article is current as of August 2023, but please be aware that things may have changed since then.</p>
</blockquote>