Speak to me: How many words a model is reading

<p><a href="https://en.wikipedia.org/wiki/Large_language_model">LLMs</a> have shown their skills in recent months, demonstrating that they are proficient at a wide variety of tasks, all through a single mode of interaction: prompting.</p>
<p>In recent months there has also been a rush to extend the context window of language models. <strong>But how does this affect a language model?</strong></p>
<p>This article is divided into sections, each answering one of these questions:</p>
<ul>
<li>What is a prompt, and how do you build a good one?</li>
<li>What is the context window? How long can it be? What limits the length of a model's input sequence, and why does this matter? (See the token-counting sketch after this list.)</li>
<li>How can we overcome these limitations?</li>
<li>Do models actually make use of a long context window?</li>
</ul>
<p><a href="https://towardsdatascience.com/speak-to-me-how-many-words-a-model-is-reading-331e3af86d27">Read the full article on Towards Data Science</a></p>
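To make the notion of a context window concrete, here is a minimal sketch of checking whether a prompt fits within a model's window. The `MAX_CONTEXT` value, the model name, and the choice of the tiktoken library are illustrative assumptions, not details taken from the article; the key point is that the window is measured in tokens, not words.

```python
# Illustrative sketch: count the tokens in a prompt and compare them to an
# assumed context-window size. MAX_CONTEXT and the model name are placeholders.
import tiktoken

MAX_CONTEXT = 4096  # assumed context window; the real limit depends on the model


def fits_in_context(prompt: str, model: str = "gpt-3.5-turbo") -> bool:
    """Return True if the prompt's token count fits within MAX_CONTEXT."""
    enc = tiktoken.encoding_for_model(model)  # tokenizer matching the model
    n_tokens = len(enc.encode(prompt))        # tokens, not words, fill the window
    print(f"{n_tokens} tokens out of {MAX_CONTEXT}")
    return n_tokens <= MAX_CONTEXT


if __name__ == "__main__":
    fits_in_context("Summarize the following document: ...")
```

A prompt that exceeds the window must be truncated, split, or summarized before the model can read it, which is why the questions above about window length and its limits matter in practice.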
Tags: Reading model