Open-Source Foundation Models
We have seen an explosion of open-source foundation models, with the likes of Llama-2, Falcon, and Bloom, to name a few. However, the largest of these models are effectively out of reach for a person of modest means.
Large language models have, as the name suggests, a large number of parameters. Take Llama-2, for instance: its largest version has 70 billion parameters.
The scale of these models means that for most researchers, hobbyists, and engineers, the hardware requirements are a significant barrier.
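To get an intuition for the scale, a rough rule of thumb is that each parameter takes as many bytes as its numeric precision. The sketch below estimates the memory needed just to hold the weights of a 70-billion-parameter model at common precisions; it deliberately ignores activations, the KV cache, and framework overhead, so real requirements are higher still.

```python
# Back-of-envelope estimate of the memory needed to store model weights.
# Ignores activations, KV cache, and framework overhead (real usage is higher).

def weight_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Gigabytes needed to hold n_params weights at a given precision."""
    return n_params * bytes_per_param / 1e9

n_params = 70e9  # Llama-2 70B

for precision, nbytes in [("fp32", 4), ("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    print(f"{precision}: ~{weight_memory_gb(n_params, nbytes):.0f} GB")
# fp32: ~280 GB, fp16: ~140 GB, int8: ~70 GB, int4: ~35 GB
```

Even at half precision, the weights alone dwarf the memory of any consumer GPU.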
If you’re reading this, you have probably tried to run one of these models and found that you couldn’t. Let’s look at the hardware requirements for Meta’s Llama-2 to understand why.