At present, the open-source AI large-model landscape is dominated by Chinese tech companies, while many American tech giants have shifted toward closed-source offerings. To counter the market position of Chinese firms, American giants have also made limited moves back into open source. Among these, Google's upcoming Gemma 4 stands out as a notable contender. Reportedly built on Gemini 3.0 technology, the model's parameter count could reach 120 billion. It uses a Mixture of Experts (MoE) architecture with 15 billion active parameters and is designed to run locally and offline.
