Phi-4-reasoning (Microsoft) vs. Universal Sentence Encoder (TensorFlow)

About: Phi-4-reasoning
Phi-4-reasoning is a 14-billion-parameter, transformer-based language model optimized for complex reasoning tasks, including math, coding, algorithmic problem solving, and planning. It was trained via supervised fine-tuning of Phi-4 on carefully curated "teachable" prompts and reasoning demonstrations generated with o3-mini, and it produces detailed reasoning chains that make effective use of inference-time compute. A companion variant, Phi-4-reasoning-plus, additionally uses outcome-based reinforcement learning to produce longer reasoning traces. Phi-4-reasoning outperforms significantly larger open-weight models such as DeepSeek-R1-Distill-Llama-70B and approaches the performance of the full DeepSeek-R1 model across a wide range of reasoning tasks. For environments with constrained compute or latency, the smaller Phi-4-mini-reasoning variant, fine-tuned on synthetic data generated by DeepSeek-R1, provides high-quality, step-by-step problem solving.
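
As a rough sketch of how the model could be run locally, the snippet below uses the Hugging Face transformers library (Hugging Face is among the listed integrations). The model identifier "microsoft/Phi-4-reasoning", the chat-template call, and the generation settings are assumptions for illustration rather than an official recipe; consult the model card for the exact usage.

# Minimal sketch (assumed model ID and prompt format): run Phi-4-reasoning
# with Hugging Face transformers and let it emit its reasoning chain.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-4-reasoning"  # assumed Hugging Face model ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# A reasoning-style question; the model is expected to produce a long
# step-by-step trace before its final answer, so allow many new tokens.
messages = [{"role": "user",
             "content": "A train travels 120 km in 1.5 hours. What is its average speed in km/h?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=1024, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))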

About: Universal Sentence Encoder
The Universal Sentence Encoder (USE) encodes text into high-dimensional vectors that can be utilized for tasks such as text classification, semantic similarity, and clustering. It offers two model variants: one based on the Transformer architecture and another on Deep Averaging Network (DAN), allowing a balance between accuracy and computational efficiency. The Transformer-based model captures context-sensitive embeddings by processing the entire input sequence simultaneously, while the DAN-based model computes embeddings by averaging word embeddings, followed by a feedforward neural network. These embeddings facilitate efficient semantic similarity calculations and enhance performance on downstream tasks with minimal supervised training data. The USE is accessible via TensorFlow Hub, enabling seamless integration into various applications.
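
A minimal sketch of the TensorFlow Hub workflow described above: load the encoder, embed a few sentences, and compare them with inner products (the published embeddings are approximately unit-length, so the inner product approximates cosine similarity). The hub handle below refers to the DAN-based variant; the Transformer-based variant is published under a separate handle.

import numpy as np
import tensorflow_hub as hub

# Load the DAN-based Universal Sentence Encoder from TensorFlow Hub.
embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder/4")

sentences = [
    "How old are you?",
    "What is your age?",
    "The weather is nice today.",
]

# Each sentence is mapped to a 512-dimensional embedding.
embeddings = embed(sentences).numpy()

# Pairwise similarity: the two paraphrases should score much higher with
# each other than with the unrelated third sentence.
similarity = np.inner(embeddings, embeddings)
print(similarity)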

Platforms Supported (both products)
Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook

Audience: Phi-4-reasoning
Educators and developers seeking an AI model for complex reasoning tasks

Audience: Universal Sentence Encoder
Data scientists and machine learning engineers seeking a tool to optimize their natural language processing models with robust sentence embeddings

Support (both products)
Phone Support
24/7 Live Support
Online

API (both products)
Offers API

Pricing (both products)
No information available.
Free Version
Free Trial

Training (both products)
Documentation
Webinars
Live Online
In Person

Company Information: Microsoft
Founded: 1975
United States
azure.microsoft.com/en-us/blog/one-year-of-phi-small-language-models-making-big-leaps-in-ai/

Company Information: TensorFlow
Founded: 2015
United States
www.tensorflow.org/hub/tutorials/semantic_similarity_with_tf_hub_universal_encoder

Integrations (both products)
Azure AI Foundry
Google Colab
Hugging Face
Microsoft Azure
TensorFlow