PaLM 2
Second-generation PaLM: a family of models (reportedly up to ~340B dense parameters) trained on a diverse multilingual dataset. Powers Google's Bard and other Google products, with improved coding, reasoning, and multilingual capabilities over the original PaLM.
2023-05-01
340B
Decoder-only Transformer
Proprietary
Specifications
- Parameters: 340B
- Architecture: Decoder-only Transformer
- License: Proprietary
- Context Window: 4,096 tokens (see the sketch below)
- Type: text
- Modalities: text
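The 4,096-token context window is shared between the prompt and the model's completion, so longer prompts leave less room for output. A minimal sketch of that budget arithmetic (the constant and function name below are illustrative, not part of any Google SDK):

```python
# Illustrative only: names here are hypothetical, not a Google API.
CONTEXT_WINDOW = 4096  # PaLM 2 context window from the specification above

def remaining_output_tokens(prompt_tokens: int, window: int = CONTEXT_WINDOW) -> int:
    """Tokens left for the completion once the prompt is accounted for."""
    return max(window - prompt_tokens, 0)

print(remaining_output_tokens(3000))  # 1096 tokens remain for the model's reply
```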
Advanced Specifications
- Model Family: PaLM
- API Access: Available (PaLM API / Vertex AI; see the sketch below)
- Chat Interface: Available (Google Bard)
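As a rough usage sketch, PaLM 2's text model was exposed through the PaLM API as `text-bison-001` via the legacy `google.generativeai` Python SDK. The snippet below assumes that SDK, that model name, and a valid API key; the PaLM API has since been superseded by the Gemini API, so treat this as a historical illustration rather than a current recipe.

```python
# Sketch assuming the legacy google.generativeai PaLM SDK and a valid API key.
import google.generativeai as palm

palm.configure(api_key="YOUR_API_KEY")  # placeholder; supply your own key

# text-bison-001 was the PaLM 2-backed text model exposed by the PaLM API.
completion = palm.generate_text(
    model="models/text-bison-001",
    prompt="Summarize the difference between PaLM and PaLM 2 in two sentences.",
    temperature=0.2,
    max_output_tokens=256,  # prompt + output must fit the 4,096-token window
)

print(completion.result)  # top candidate text (None if the response was blocked)
```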