Release
2024-05-24
Falcon 2: An 11B parameter pretrained language model and VLM, trained on over 5000B tokens and 11 languages
Source: Hugging Face (original article)
Tags: open-source, models