Arabic-Centric Open Foundation and Instruction-Tuned Generative Large Language Models

# The Release

We introduce Jais, a new state-of-the-art Arabic-centric large language model (LLM). Jais achieves the best performance of any open-source Arabic LLM to date and matches English LLMs of similar size, despite having been trained on substantially less data. We release our model weights and inference code in both variants, the [base jais-13b](https://huggingface.co/inception-mbzuai/jais-13b) and the [instruction-tuned jais-13b-chat](https://huggingface.co/inception-mbzuai/jais-13b-chat), to the community. Our work represents a significant leap in the capabilities of Arabic generative AI and in the development of methods relevant to multilingual LLMs.

[**Read More**](https://medium.com/@MohamedbinZayedAI/arabic-centric-open-foundation-and-instruction-tuned-generative-large-language-models-4386ba098273)
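
Since the weights are published on the Hugging Face Hub, they can be loaded with the standard `transformers` API. The snippet below is a minimal sketch, not the official inference code: the model IDs come from the links above, `trust_remote_code=True` is assumed to be needed because the repositories ship custom modeling code, and the chat variant's exact prompt template is documented on its model card.

```python
# Minimal sketch: loading the released chat model with Hugging Face transformers.
# Assumptions: transformers + accelerate are installed and the repo's custom
# modeling code must be trusted (trust_remote_code=True).
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "inception-mbzuai/jais-13b-chat"  # swap for "inception-mbzuai/jais-13b" to use the base model

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",       # place the 13B weights across available devices
    trust_remote_code=True,  # assumed: the repo defines its own model class
)

prompt = "ما هي عاصمة الإمارات؟"  # "What is the capital of the UAE?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```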