Phi-3 Mini: Microsoft launches its smallest AI model

Photo by Salvatore De Lellis : https://www.pexels.com/photo/glass-panels-exterior-of-the-microsoft-building-9683980/

Microsoft has announced the release of Phi-3 Mini, the first model in its Phi-3 lineup of small language models (SLMs). This new addition combines high efficiency with the kind of robust performance typically seen in larger models.

The Phi-3 Mini, with its 3.8 billion parameters, is designed to deliver exceptional performance, particularly in environments where computing resources are limited or where high-speed processing is critical. 

Phi-3 Mini is now available on various platforms, including Microsoft Azure AI Studio, Hugging Face, and Ollama, making it accessible for a wide range of developers and industries.  

This model is unique because it supports an extensive context window of up to 128,000 tokens, setting a new standard for SLMs by allowing more extensive and complex data processing with minimal quality compromise. 

Moreover, the Phi-3 Mini is “instruction-tuned,” meaning it has been trained to understand and execute a range of instructions that mimic human-like communication, ensuring it performs efficiently. This feature is particularly beneficial for developers looking to integrate the model into applications without extensive additional training. 
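A minimal sketch of what interacting with an instruction-tuned model looks like, assuming the special tokens (`<|user|>`, `<|assistant|>`, `<|end|>`) documented on the Phi-3 Mini model card; in practice, the Hugging Face tokenizer's `apply_chat_template` builds this string for you:

```python
def build_phi3_prompt(messages):
    """Assemble a Phi-3-style chat prompt from a list of
    {"role": ..., "content": ...} messages. The special tokens
    follow the Phi-3 Mini chat format; this is an illustrative
    sketch, not a replacement for the tokenizer's own
    apply_chat_template method."""
    parts = []
    for msg in messages:
        # Each turn is wrapped in a role tag and terminated with <|end|>.
        parts.append(f"<|{msg['role']}|>\n{msg['content']}<|end|>\n")
    # Leave the assistant tag open so the model generates the reply.
    parts.append("<|assistant|>\n")
    return "".join(parts)

prompt = build_phi3_prompt([
    {"role": "user", "content": "Summarize the benefits of small language models."}
])
print(prompt)
```

Because the model is already tuned to follow this conversational structure, developers can send plain-language instructions without any additional fine-tuning.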

In terms of compatibility and optimization, Phi-3 Mini is highly capable across various platforms. It is optimized for ONNX Runtime, enhancing its performance on a broad spectrum of hardware from GPUs and CPUs to mobile devices. Additionally, it is compatible with Windows DirectML and can be deployed as an NVIDIA NIM microservice, offering a flexible API interface for easy integration anywhere. 

Looking ahead, Microsoft plans to expand the Phi-3 series. In the coming weeks, they will introduce Phi-3 Small and Phi-3 Medium models, each designed to provide more choices and flexibility for users. 

The development of the Phi-3 Mini is guided by Microsoft’s Responsible AI Standards, which emphasize safety, reliability, and fairness. The model has undergone extensive safety evaluations to ensure it adheres to the highest ethical standards of AI development and deployment.

Although Phi-3 performs better than many models of its size, SLMs are trained on far smaller datasets than LLMs, so they cannot match the depth and breadth of larger models. Their main advantages are that they are highly cost-effective and can run easily on small devices.

Ece Kamar, a Microsoft vice president who leads the Microsoft Research AI Frontiers Lab, says:

“The claim here is not that SLMs are going to substitute or replace large language models,” said Kamar. Instead, SLMs “are uniquely positioned for computation on the edge, computation on the device, computations where you don’t need to go to the cloud to get things done. That’s why it is important for us to understand the strengths and weaknesses of this model portfolio.”

Microsoft’s competitors have also released small models, such as Google’s Gemma and Anthropic’s Claude 3 Haiku. Recently, Meta released Llama 3, whose smallest version has 8 billion parameters.

The application of Phi-3 Mini is expected to transform sectors where quick data processing is crucial. It can be particularly impactful in areas such as agriculture, where it can assist in analyzing crop data on-site, or in urban planning, where real-time data interpretation can enhance decision-making. In agriculture, ITC, a leading Indian conglomerate, is collaborating with Microsoft to use Phi-3 in its farming app, Krishi Mitra.

“Our goal with the Krishi Mitra copilot is to improve efficiency while maintaining the accuracy of a large language model. We are excited to partner with Microsoft on using fine-tuned versions of Phi-3 to meet both our goals—efficiency and accuracy!”

— Saif Naik, Head of Technology, ITCMAARS

Moreover, Microsoft’s commitment to accessible AI is reflected in the deployment options for Phi-3 Mini, which include cloud-based solutions and on-device integrations, ensuring organizations can leverage powerful AI tools regardless of their infrastructure or budget. 

With its advanced capabilities and ethical alignment, Phi-3 Mini is poised to be a pivotal tool, driving innovation and efficiency across a wide range of applications while remaining cost-effective.