Microsoft has published a new blog post highlighting its efforts to teach small language models how to reason. A few months ago, the company debuted Orca, "a 13-billion parameter language model that demonstrated strong reasoning abilities by imitating the step-by-step reasoning traces of more capable LLMs."