Release · 2024-05-24

Falcon 2: An 11B-parameter pretrained language model and VLM, trained on over 5,000B tokens in 11 languages

Source: Hugging Face

open-source · models