v2.4.2

January 22, 2026

Standardize model integrations with OpenAI Responses API compatibility

We introduced OpenAI Responses API–compatible clients, including a base OpenResponses client and provider-specific clients for Ollama and OpenRouter. This gives teams a consistent request/response schema across local and hosted models, simplifying migrations and reducing provider-specific branching. The result is faster adoption, cleaner integrations, and the flexibility to switch or mix models without refactoring.

Details

  • One API shape across multiple providers for better portability and governance
  • Supports self-hosted models (Ollama) and hosted model marketplaces (OpenRouter)
  • No breaking changes — upgrade and start using Responses-compatible clients
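To illustrate the idea of one API shape across providers, here is a minimal sketch of a base client with provider-specific subclasses. The class names, method signatures, and default URLs below are illustrative assumptions, not the actual library API; a real client would issue HTTP requests rather than return a stub.

```python
from dataclasses import dataclass

@dataclass
class Response:
    # Hypothetical minimal response shape shared by all providers
    model: str
    output_text: str

class OpenResponses:
    """Base client: one request/response shape for every provider."""

    def __init__(self, base_url: str):
        self.base_url = base_url

    def create(self, model: str, input: str) -> Response:
        # A real implementation would POST to the provider's
        # Responses-compatible endpoint; this stub keeps the
        # example self-contained.
        return Response(model=model, output_text=f"[{model}] {input}")

class OllamaClient(OpenResponses):
    # Default URL is an assumption for a local Ollama server
    def __init__(self, base_url: str = "http://localhost:11434"):
        super().__init__(base_url)

class OpenRouterClient(OpenResponses):
    # Default URL is an assumption for the hosted marketplace
    def __init__(self, base_url: str = "https://openrouter.ai/api"):
        super().__init__(base_url)

# The call shape is identical regardless of provider:
local = OllamaClient().create(model="llama3", input="hello")
hosted = OpenRouterClient().create(model="gpt-4o", input="hello")
```

Because both clients inherit the same `create` signature and `Response` shape, switching providers is a one-line change to client construction, with no branching in downstream code.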

Who this is for: Platform teams running hybrid model stacks and organizations seeking vendor flexibility with minimal integration overhead.