Kimi Announces Price Cuts: Significant Reductions for Model Inference Service and Context Caching
2025-04-07
Author: Editorial Team

The Kimi Open Platform has announced price reductions for its model inference service and context caching, made possible by Moonshot AI's technological advances and performance optimizations. Context caching prices have been cut by up to 50%, a move intended to lower costs for developers and strengthen the competitiveness of the platform's services.