The Kimi Open Platform has announced a pricing adjustment for its model inference and context caching services. Enabled by Moonshot AI's technological advances and performance optimizations, the change cuts context caching costs by up to 50%, lowering expenses for developers and strengthening the competitiveness of the platform's services.
