Megatron-Turing NLG 530B
A 530-billion-parameter decoder-only Transformer language model, one of the largest dense LLMs of 2021. A collaboration between NVIDIA and Microsoft, it was used primarily for research on scaling.
Released: 2021-10-01
Specifications
- Parameters: 530B
- Architecture: Decoder-only Transformer
- License: Proprietary (restricted access)
- Context Window: 2,048 tokens
- Type: text
- Modalities: text
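The parameter count above permits a back-of-the-envelope memory estimate, which helps explain why a model this size cannot fit on a single GPU. A minimal sketch in Python, using illustrative arithmetic only (the 16-bytes-per-parameter training figure assumes a standard mixed-precision Adam setup: fp16 weights and gradients plus fp32 master weights, momentum, and variance; it is not a published figure for this model):

```python
# Back-of-the-envelope memory estimates from the published 530B parameter count.
# Illustrative arithmetic only; the actual model was trained with model and
# pipeline parallelism spread across many GPUs.

PARAMS = 530e9  # 530 billion parameters

# Inference: fp16 weights at 2 bytes per parameter.
weights_tb = PARAMS * 2 / 1e12

# Training with mixed-precision Adam (assumed setup, ~16 bytes/param):
# fp16 weights (2) + fp16 grads (2) + fp32 master/momentum/variance (4+4+4).
training_tb = PARAMS * 16 / 1e12

print(f"fp16 weights:   ~{weights_tb:.2f} TB")
print(f"training state: ~{training_tb:.2f} TB")
```

Even the fp16 weights alone (~1.06 TB) exceed the memory of any single accelerator of the era, so both inference and training required partitioning the model across devices.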
Advanced Specifications
- Model Family: Megatron
- API Access: Not available
- Chat Interface: Not available