Codex
A descendant of GPT-3 fine-tuned on billions of lines of source code from publicly available sources, including code in public GitHub repositories. It powers GitHub Copilot's code autocompletion and demonstrated strong coding ability, solving 28.8% of HumanEval problems at 12B parameters. It is most capable in Python, but proficient in over a dozen languages including JavaScript, Go, Perl, PHP, Ruby, Swift, TypeScript, and Shell.
2021-08-10
12B
Decoder-only Transformer
Proprietary
Specifications
- Parameters: 12B
- Architecture: Decoder-only Transformer
- License: Proprietary
- Context Window: 4,096 tokens
- Type: text
- Modalities: text
Benchmark Scores
- HumanEval: 28.8
  Evaluates code generation capabilities by asking models to complete Python functions based on docstrings.
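Each HumanEval task gives the model a Python function signature plus docstring; the model generates the body, and the sample passes only if it satisfies the task's hidden unit tests. A minimal sketch of that pass/fail harness, using a problem modeled on the benchmark's `has_close_elements` task (the prompt, completion, and tests here are illustrative, not the exact benchmark content):

```python
# The model sees the signature and docstring...
PROMPT = '''def has_close_elements(numbers, threshold):
    """Check if any two numbers in the list are closer
    to each other than the given threshold."""
'''

# ...and must produce the function body. This is one candidate
# completion, as a model might generate it.
COMPLETION = '''    for i, a in enumerate(numbers):
        for j, b in enumerate(numbers):
            if i != j and abs(a - b) < threshold:
                return True
    return False
'''

def check_candidate(prompt: str, completion: str) -> bool:
    """Assemble and execute the function, then run unit tests,
    mirroring how HumanEval scores a sample as pass or fail."""
    namespace = {}
    exec(prompt + completion, namespace)
    f = namespace["has_close_elements"]
    # Hidden unit tests for this task (illustrative).
    return (
        f([1.0, 2.0, 3.9, 4.0], 0.3) is True
        and f([1.0, 2.0, 5.9, 4.0], 0.8) is False
        and f([1.1, 2.2, 3.1, 4.1, 5.1], 1.0) is True
    )

print(check_candidate(PROMPT, COMPLETION))  # True: this sample passes
```

The reported 28.8 is the fraction of problems for which a generated sample passes such tests on the first attempt.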
Advanced Specifications
- Model Family: Codex
- Finetuned From: GPT-3
- API Access: Not Available
- Chat Interface: Not Available
Capabilities & Limitations
- Capabilities: code