Smaller artificial intelligence models are appearing as downloadable programs for smartphones and laptops. They take up less space, are built for a specific purpose, and make everyday use of AI easier. Now Microsoft is getting in on the action with Phi-3, which will come in three versions: Mini, Small, and Medium.
Eric Boyd, vice president of Azure, Microsoft’s AI platform, told The Verge that Phi-3 Mini “will be as powerful as large language models like GPT-3.5 (the model behind the current free version of ChatGPT, ed.), but in a smaller size.”
In addition to Mini, there will also be Small and Medium models, each with a different number of parameters, a rough measure of a model’s size and capability. The Mini version is relatively limited at 3.8 billion parameters. Phi-3 Mini is available for download through Azure, Hugging Face, and Ollama.
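For readers who want to try it themselves, the sketch below shows how such a model might be loaded locally with the Hugging Face transformers library; the repository name and the prompt are assumptions for illustration, not details confirmed in this article.

```python
# Minimal sketch: running Phi-3 Mini locally via the Hugging Face transformers library.
# The repository name is an assumption and may differ from the official release;
# depending on your transformers version, trust_remote_code=True may also be required.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"  # assumed Hugging Face repository name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)  # downloads several GB of weights

prompt = "Summarize in one sentence why small language models are useful on laptops."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=60)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```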
The Phi models were built with a particular ‘curriculum,’ Boyd said. Part of their training material, for example, consisted of synthetic children’s stories. “There are not enough children’s books, so we took a list of over 3,000 words and asked an LLM to create ‘children’s books’ to teach Phi.”
Why exactly are tech companies bringing smaller AI models to market?
Boyd said that smaller models like Phi-3 are a better fit for specific enterprise applications, since companies’ internal data sets are often relatively small anyway. They also tend to be more affordable because they require less computing power.
Other companies got there before Microsoft. Google launched Gemma, a family of lightweight models built on the same technology as Gemini. Meta released small versions of its Llama models, and Anthropic’s Claude 3 Haiku, in turn, can quickly summarize scientific papers.
Tech companies are thus diversifying their AI models, countering the perception that AI is inherently cumbersome and heavy. That makes the technology more accessible: as an end user, you know what a particular model is suited for.
News Wires