
Phi-2

Microsoft · Open Source · Pending Verification

A 2.7B-parameter model focused on efficiency and quality, trained on curated high-quality (including synthetic) "textbook-quality" data. It achieved surprisingly strong results for its size.

Released: 2023-12-01
Parameters: 2.7B
Architecture: Decoder-only Transformer
License: MIT

Specifications

Parameters
2.7B
Architecture
Decoder-only Transformer
License
MIT
Context Window
2,048 tokens
Type
text
Modalities
text
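The card lists a 2,048-token context window as the model's only hard limit. The sketch below shows one way to budget a prompt plus generation against that window, with an optional local run via Hugging Face `transformers`; the checkpoint id `microsoft/phi-2`, the prompt, and the generation settings are assumptions for illustration, not details taken from this card.

```python
# Sketch: budgeting prompt + generation against Phi-2's context window.

PHI2_CONTEXT_WINDOW = 2048  # tokens, per the Specifications above


def fits_context(prompt_tokens: int, max_new_tokens: int,
                 context_window: int = PHI2_CONTEXT_WINDOW) -> bool:
    """Return True if the prompt plus requested generation fits the window."""
    return prompt_tokens + max_new_tokens <= context_window


if __name__ == "__main__":
    # Optional local run. Assumes the Hugging Face checkpoint id
    # "microsoft/phi-2" (an assumption, not stated on this card) and that
    # `transformers` and the model weights are available locally.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tok = AutoTokenizer.from_pretrained("microsoft/phi-2")
    model = AutoModelForCausalLM.from_pretrained("microsoft/phi-2")

    prompt = "Explain why the sky is blue in one sentence."
    ids = tok(prompt, return_tensors="pt")
    n_prompt = ids["input_ids"].shape[1]
    assert fits_context(n_prompt, 64)  # leave room for 64 new tokens

    out = model.generate(**ids, max_new_tokens=64)
    print(tok.decode(out[0], skip_special_tokens=True))
```

Since the card marks API access as unavailable, local inference with the open MIT-licensed weights is the expected usage path.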

Benchmark Scores

Advanced Specifications

Model Family
Phi
API Access
Not Available
Chat Interface
Not Available

Capabilities & Limitations

Related Models