
Baichuan 7B reaches the top of the LLM leaderboard for its size?

GitHub - baichuan-inc/baichuan-7B: A large-scale 7B pretraining language model developed by BaiChuan-Inc.

baichuan-7B is an open-source large-scale pre-trained language model developed by Baichuan Intelligent Technology. Based on the Transformer architecture, it has 7 billion parameters and was trained on approximately 1.2 trillion tokens. It supports both Chinese and English, with a context window of 4,096 tokens, and achieves the best performance among models of its size on authoritative Chinese and English benchmarks (C-Eval/MMLU).

GitHub: https://github.com/baichuan-inc/baichuan-7B

Hugging Face: https://huggingface.co/baichuan-inc/baichuan-7B
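For anyone who wants to try it, here's a minimal loading sketch using the standard transformers causal-LM path. The repo ships custom modeling code, so trust_remote_code=True is needed; the prompt and generation settings below are just illustrative, not from the model card:

```python
# Minimal sketch of loading baichuan-7B via Hugging Face transformers.
# trust_remote_code=True is required because the repo ships custom model code.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "baichuan-inc/baichuan-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # fp16 weights fit in roughly 14 GB of GPU memory
    device_map="auto",          # requires the `accelerate` package
    trust_remote_code=True,
)

# This is a base (non-chat) model, so use plain text completion.
inputs = tokenizer("The capital of France is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32, repetition_penalty=1.1)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```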
