Deploying and fine-tuning LLMs in AWS for job listing summarisation
<p>Not all that glitters is an OpenAI GPT model, and even more so when it comes to summarisation. That is the journey we embarked on a few months ago when we started working on job listing summarisation.</p>
<p>StepStone, one of the world’s leading job platforms, has long built products revolving around data, and AI is at the forefront of its strategy. One of our largest sources of data is job listing text, and there are several reasons why summarising it matters to StepStone:</p>
<ul>
<li>It allows users to preview the most important aspects of a job listing, letting them quickly gauge whether they could be a good fit. It also provides a more digestible format for mobile application users.</li>
<li>It unifies writing style, which, together with concise and clear text, eases readability for job applicants who want a quick overview of the different offers available.</li>
<li>It reduces the computational time needed to calculate embeddings, which benefits other use cases such as recommendation algorithms that match users with suitable job offers.</li>
</ul>
<p>Given the differing requirements of these use cases across the organisation, LLMs are a great solution. However, from the very beginning we focused on open-source models or, alternatively, models available through AWS, where we have full control over our infrastructure.</p>
<p><a href="https://medium.com/stepstone-tech/deploying-and-fine-tuning-llms-in-aws-for-job-listing-summarisation-c3e16e00aab3"><strong>Read More</strong></a></p>