You don’t need hosted LLMs, do you?

During the LLM hype, you can find many articles like "Fine-tune your Private LLaMA/Falcon/Another Popular LLM", "Train Your Own Private ChatGPT", or "How to Create a Local LLM".

At the same time, few of them explain why you would need it. Are you really sure you need your own self-hosted LLM? Maybe the OpenAI API is the better choice for you.

[Image: "Short" summary of the article — https://miro.medium.com/v2/resize:fit:1000/1*-z4jrLqZF4MBaZaditRehg.png]

In this article, I compare two approaches to using LLMs: making API calls to OpenAI versus deploying your own model. We will discuss aspects such as cost, text generation quality, development speed, and privacy.

Disclaimer: The information in this article is current as of August 2023, but please be aware that changes may occur thereafter.

Learn more: https://betterprogramming.pub/you-dont-need-hosted-llms-do-you-1160b2520526
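
To make the comparison concrete, here is a minimal Python sketch of the two approaches the article contrasts: calling a hosted model through the OpenAI API versus running an open model locally. It assumes the openai (v1+) and transformers packages are installed and that OPENAI_API_KEY is set; the model names are illustrative choices, not the author's exact setup.

from openai import OpenAI
from transformers import pipeline

prompt = "Summarize the trade-offs of self-hosting an LLM."

# Option 1: hosted LLM via the OpenAI API -- no infrastructure to manage,
# but every prompt leaves your environment and you pay per token.
client = OpenAI()
hosted_reply = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": prompt}],
)
print(hosted_reply.choices[0].message.content)

# Option 2: self-hosted open model (Falcon here, as an example) -- data stays
# on your own hardware, but you provision and pay for the GPU yourself.
local_llm = pipeline("text-generation", model="tiiuae/falcon-7b-instruct")
local_reply = local_llm(prompt, max_new_tokens=200)
print(local_reply[0]["generated_text"])

Note that the self-hosted option requires a machine with enough GPU memory for the chosen model; the rest of the article weighs exactly these kinds of operational costs against the convenience of the hosted API.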
Tags: LLMs hosted