T5 (Text-to-Text Transfer Transformer)
Encoder–decoder Transformer trained on a unified text-to-text format that casts diverse NLP tasks (translation, summarization, classification, question answering) as text generation, selected by short task prefixes. This unified framing enabled prefix-/prompt-based multitask learning.
Released: 2019-10-01 · 11B parameters · Encoder–Decoder Transformer (Seq2Seq) · Apache 2.0
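A minimal sketch of the text-to-text usage pattern described above, assuming the Hugging Face `transformers` library and the smaller public `t5-small` checkpoint (neither is named on this card); the task is chosen by a plain-text prefix on the input rather than a task-specific head:

```python
# Sketch: T5's text-to-text interface via Hugging Face transformers (assumed tooling).
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# The prefix selects the task; the model emits its answer as plain text.
prompt = "translate English to German: The house is wonderful."
input_ids = tokenizer(prompt, return_tensors="pt").input_ids
output_ids = model.generate(input_ids, max_new_tokens=40)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

The same pattern covers summarization ("summarize: ..."), acceptability classification ("cola sentence: ..."), and question answering, which is what the unified text-to-text format refers to.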
Specifications
- Parameters
- 11B
- Architecture
- Encoder–Decoder Transformer (Seq2Seq)
- License
- Apache 2.0
- Context Window
- 512 tokens (see the sketch after this list)
- Type
- text
- Modalities
- text
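Because the card lists a 512-token context window, here is a small sketch of truncating inputs to that trained length, again assuming the `transformers` tokenizer and the `t5-small` checkpoint (T5's relative position biases can in principle accept longer inputs, but 512 tokens is the training length listed above):

```python
# Sketch: keeping inputs within the 512-token context listed in the Specifications.
from transformers import T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")

long_text = "summarize: " + ("This is a very long document paragraph. " * 300)
enc = tokenizer(long_text, max_length=512, truncation=True, return_tensors="pt")
print(enc.input_ids.shape)  # torch.Size([1, 512]) -- anything past 512 tokens is dropped
```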
Advanced Specifications
- Model Family
- T5
- API Access
- Not Available
- Chat Interface
- Not Available