LLaMA 65B
Large Language Model Meta AI, released by Meta AI in sizes from 7B to 65B parameters under a research-only license. The 65B model was trained on 1.4T tokens and reported performance competitive with Chinchilla-70B and PaLM-540B; after the weights leaked, it spurred a wave of community fine-tuned variants.
2023-02-24
65B
Decoder-only Transformer
Non-commercial (research-only)
Specifications
- Parameters: 65B
- Architecture: Decoder-only Transformer
- License: Non-commercial (research-only)
- Context Window: 2,048 tokens
- Type: Text
- Modalities: Text
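The 65B figure above can be roughly cross-checked from the hyperparameters reported in the LLaMA paper (80 layers, hidden dimension 8192, SwiGLU feed-forward, 32k vocabulary). The sketch below is an approximation: the FFN width rounding follows the reference implementation's convention (an assumption), and small terms such as norm weights are omitted.

```python
# Rough parameter count for a decoder-only Transformer, using the
# LLaMA 65B hyperparameters reported in the paper. Norm weights and
# rotary position embeddings (no learned parameters) are omitted.

def llama_param_count(n_layers: int, d_model: int, vocab: int) -> int:
    # Attention: separate Q, K, V, and output projections, each d_model x d_model.
    attn = 4 * d_model * d_model
    # SwiGLU feed-forward: hidden width is 2/3 * 4d, rounded up to a
    # multiple of 256 (assumption based on the reference implementation),
    # with three weight matrices (gate, up, down).
    ffn_dim = ((int(2 * 4 * d_model / 3) + 255) // 256) * 256
    ffn = 3 * d_model * ffn_dim
    # Untied input embedding and output head.
    embeddings = 2 * vocab * d_model
    return n_layers * (attn + ffn) + embeddings

total = llama_param_count(n_layers=80, d_model=8192, vocab=32_000)
print(f"{total / 1e9:.1f}B parameters")  # lands close to the advertised 65B
```

The result comes out just above 65B, matching the reported 65.2B parameter count to within rounding.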
Benchmark Scores
Advanced Specifications
- Model Family: LLaMA
- API Access: Not Available
- Chat Interface: Not Available