TII Unveils Open-Source Large Model Falcon H1R 7B: Strong Reasoning with Just 7 Billion Parameters
Author: Editor

PingWest News, January 6th — The Abu Dhabi-based Technology Innovation Institute (TII) has released a new open-source large model, Falcon H1R 7B. Despite having only 7 billion parameters, the model performs strongly on mathematics, programming, and scientific reasoning benchmarks, outperforming some larger-scale counterparts. Built on a hybrid architecture that combines Transformer and Mamba layers, it reaches an inference throughput of 1,500 tokens per second per GPU, roughly twice as fast as some rival models, making it well suited to compute-constrained environments. Both the full checkpoint and quantized versions of the model are now available on the Hugging Face platform.
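For readers who want to try the released checkpoints, below is a minimal sketch of loading the model with the Hugging Face transformers library. The repository id "tiiuae/Falcon-H1R-7B" is an assumption based on TII publishing under the tiiuae organization; the exact name, and the transformers version required for the hybrid Transformer–Mamba architecture, should be checked on the Hugging Face hub.

```python
# Minimal sketch: loading a Falcon H1R checkpoint from Hugging Face.
# NOTE: the repo id below is an assumption, not confirmed by the article;
# a recent transformers release with Falcon-H1 support is assumed as well.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/Falcon-H1R-7B"  # assumed repo id; verify on the hub

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to fit a single modest GPU
    device_map="auto",           # place layers on whatever devices are available
)

# Simple generation call on a reasoning-style prompt.
prompt = "Prove that the sum of two even integers is even."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```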
