Fine-tuning LLMs
<h1>Catastrophic Forgetting (degrades model performance)</h1>
<p>Catastrophic forgetting occurs when a machine learning model loses previously learned information as it is trained on new information.</p>
<p>It is especially problematic in sequential (continual) learning scenarios, where the model is trained on a series of tasks over time and the gradient updates for each new task overwrite the parameters that encoded earlier ones.</p>
<p>The problem appears across machine learning, but it is particularly pronounced in deep learning models, including fine-tuned LLMs.</p>
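<p>The effect can be reproduced with a deliberately tiny model. The sketch below is an illustrative toy, not an LLM: a logistic-regression classifier trained with plain gradient descent on two conflicting synthetic tasks. Accuracy on the first task collapses once the model is fine-tuned on the second, because the second task's gradients overwrite the only weights the model has.</p>

```python
import numpy as np

# Toy demonstration of catastrophic forgetting (illustrative, not an LLM):
# a linear classifier is trained on Task A, then fine-tuned on Task B,
# whose labels conflict with Task A's. Task A performance collapses.

rng = np.random.default_rng(0)

def make_task(flip):
    """Synthetic 2D task: label = sign of the first feature (or its flip)."""
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] > 0).astype(float)
    return X, (1 - y) if flip else y

def train(w, X, y, lr=0.5, steps=200):
    """Plain gradient descent on the logistic loss."""
    for _ in range(steps):
        p = 1 / (1 + np.exp(-X @ w))          # sigmoid predictions
        w = w - lr * X.T @ (p - y) / len(y)   # logistic-loss gradient step
    return w

def accuracy(w, X, y):
    return float((((X @ w) > 0).astype(float) == y).mean())

X_a, y_a = make_task(flip=False)   # Task A: label follows the first feature
X_b, y_b = make_task(flip=True)    # Task B: the opposite labeling

w = train(np.zeros(2), X_a, y_a)
acc_before = accuracy(w, X_a, y_a)   # high: the model has learned Task A

w = train(w, X_b, y_b)               # sequential fine-tuning on Task B only
acc_after = accuracy(w, X_a, y_a)    # low: Task A has been "forgotten"

print(f"Task A accuracy before: {acc_before:.2f}, after: {acc_after:.2f}")
```

<p>The conflict here is deliberately extreme, but the mechanism is the same one at work when an LLM fine-tuned on a narrow task drifts away from its general pre-trained behavior: the new objective alone drives every parameter update.</p>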
<h2>Example</h2>
<p>Consider a sentiment-classification task: we fine-tune the model so that, given a sentence, it outputs a sentiment label rather than free-form text, and the fine-tuned model does this well. The catch is that in gaining this narrow skill, the model can degrade on the general capabilities it had before fine-tuning.</p>
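<p>A minimal sketch of what such fine-tuning data might look like, assuming an instruction-style prompt/completion format (the field names, review texts, and label strings here are illustrative assumptions, not taken from the linked article):</p>

```python
# Hypothetical fine-tuning examples for the sentiment task: the model is
# trained to emit a short label ("positive"/"negative") instead of a sentence.
examples = [
    {"prompt": "Review: The movie was wonderful.\nSentiment:",
     "completion": " positive"},
    {"prompt": "Review: Terrible service, never again.\nSentiment:",
     "completion": " negative"},
]

def to_training_text(ex):
    # Concatenate prompt and completion into one training sequence,
    # as is common for causal-LM fine-tuning.
    return ex["prompt"] + ex["completion"]

for ex in examples:
    print(to_training_text(ex))
```

<p>Training exclusively on pairs like these is exactly the setup in which the forgetting described above can appear: every update pushes the model toward emitting labels, and nothing preserves its other behaviors.</p>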
<p><a href="https://teetracker.medium.com/fine-tuning-llms-9fe553a514d0"><strong>Website</strong></a></p>