Emergent Abilities in AI: Are We Chasing a Myth?

[Emergent properties](https://en.wikipedia.org/wiki/Emergence) are not a concept that belongs only to artificial intelligence; they appear across disciplines, from physics to biology. The idea has long fascinated scientists, both as a phenomenon to describe and as one whose origin they try to understand. Nobel Prize-winning physicist [P.W. Anderson](https://en.wikipedia.org/wiki/Philip_W._Anderson) summarized it as "More Is Different." In this sense, an emergent property is one that appears as the complexity of a system increases and that cannot be predicted from the system's smaller-scale components.

For example, a small molecule can encode information, but only a large molecule like DNA encodes an entire genome. Likewise, a [small amount of uranium](https://bounded-regret.ghost.io/future-ml-systems-will-be-qualitatively-different/) does not lead to a nuclear reaction, while a critical mass does.

*"The formation of complex symmetrical and [fractal](https://en.wikipedia.org/wiki/Fractal) [patterns](https://en.wikipedia.org/wiki/Patterns_in_nature) in [snowflakes](https://en.wikipedia.org/wiki/Snowflake) exemplifies emergence in a physical system." Image source: [Wikipedia](https://en.wikipedia.org/wiki/Emergence)*

Recently, the same behavior has been observed in artificial intelligence models. One of the most commonly [used definitions](https://arxiv.org/abs/2206.07682) is: "An ability is emergent if it is not present in smaller models but is present in larger models."

> What does this mean, and how is it observed?

OpenAI showed in a paper that a model's performance follows a [scaling law](https://arxiv.org/abs/2001.08361): the more data and parameters, the better the performance, with the loss decreasing smoothly as a power law. With emergent abilities, a different pattern is expected: as the number of parameters increases, performance on a task remains near random until, at a certain threshold, the ability appears and performance begins to improve sharply. The curve shows a sudden turn, called a phase transition, and the ability is called emergent because it cannot be predicted by extrapolating from a small-scale model. The sketch below contrasts the two patterns.

[Read the full article on Towards Data Science](https://towardsdatascience.com/emergent-abilities-in-ai-are-we-chasing-a-myth-fead754a1bf9)
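To make the contrast concrete, here is a minimal Python sketch of the two curve shapes. The power-law constants (α ≈ 0.076, N_c ≈ 8.8 × 10¹³) are roughly the values reported by Kaplan et al. for the parameter-count scaling law, but the "emergent" curve is purely illustrative: the threshold, steepness, and random baseline are assumptions chosen to reproduce the qualitative shape, not measurements from any real model.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical model sizes (number of parameters), log-spaced from 1M to 100B.
n_params = np.logspace(6, 11, 200)

# Smooth scaling law for loss, in the spirit of Kaplan et al. (2020):
# L(N) = (N_c / N) ** alpha. Constants roughly as reported in the paper.
alpha, n_c = 0.076, 8.8e13
loss = (n_c / n_params) ** alpha

# A toy "emergent" ability: accuracy sits near the random baseline (25%,
# e.g. 4-way multiple choice) until a threshold scale, then climbs sharply.
# Modeled as a logistic curve in log10(N); threshold and steepness are made up.
random_baseline = 0.25
threshold, steepness = 10.0, 6.0  # assumed threshold at ~1e10 parameters
accuracy = random_baseline + (0.95 - random_baseline) / (
    1 + np.exp(-steepness * (np.log10(n_params) - threshold))
)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.loglog(n_params, loss)
ax1.set(xlabel="Parameters", ylabel="Loss", title="Smooth scaling law")
ax2.semilogx(n_params, accuracy)
ax2.axhline(random_baseline, linestyle="--", color="gray", label="random baseline")
ax2.set(xlabel="Parameters", ylabel="Task accuracy",
        title="Emergent ability (phase transition)")
ax2.legend()
plt.tight_layout()
plt.show()
```

The logistic curve in log-parameter space is just one convenient stand-in for the "sharp turn": on the left panel the loss improves predictably at every scale, while on the right panel nothing in the sub-threshold region hints at the jump to come, which is exactly why such abilities are hard to predict from smaller models.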
Tags: Myth, Emergent