Minerva LLMs
The first family of Large Language Models pretrained from scratch on Italian text!
Stay tuned for the technical report on Minerva!
Our Minerva Models
Minerva-350M-base-v1.0
This compact model is fast and lightweight, making it ideal for applications that require quick responses and limited computational resources. Pretrained on both Italian and English, it is well suited to fine-tuning in specialized domains where tailored responses are crucial.
Minerva-1B-base-v1.0
With 1 billion parameters, this model strikes an effective balance between performance and resource consumption, providing a robust base for more complex, multi-faceted language tasks without extensive computational overhead.
Minerva-3B-base-v1.0
This powerful and comprehensive model is trained on an extensive corpus of Italian and English text, enabling sophisticated language understanding and generation. It is optimized for delivering high-quality, general-purpose language solutions across a broad range of contexts and applications.
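As a quick-start sketch, the models can be loaded with the Hugging Face transformers library. The repository ID below (`sapienzanlp/Minerva-3B-base-v1.0`) is an assumption; check the Minerva Hub page for the exact identifiers of each model.

```python
# Minimal sketch: text completion with a Minerva base model via transformers.
# NOTE: the Hub repository ID is an assumption; verify it on the Minerva page.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "sapienzanlp/Minerva-3B-base-v1.0"  # assumed Hub ID


def generate(prompt: str, max_new_tokens: int = 50) -> str:
    """Greedy completion with a Minerva base model (no instruction tuning)."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Note that these are base (pretrained) models, not chat models: they continue a prompt rather than follow instructions, so prompts should be phrased as text to be completed.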