Mozilla, via its Builders initiative, has introduced LocalScore, a benchmarking tool for local AI large language models (LLMs) that runs on both Windows and Linux. LocalScore builds on Llamafile, the Mozilla Ocho project that packages LLMs into single, easily distributed executables. Though still in its early stages, LocalScore is already showing promising results. It is integrated into Llamafile as a new feature, and LocalScore.ai serves as an optional destination for uploading CPU/GPU benchmark results, which are derived from the official Mini/Small/Medium Meta Llama 3.1 models. Whether run from the Llamafile package or as a standalone LocalScore binary, it offers a straightforward, portable way to benchmark AI workloads and assess how well a system handles LLMs.
