backstage release v0.122.2

Changed

  • Stream reasoning tokens from OpenAI-compatible models (such as vLLM) to the AI chat UI. The aiChat.openai.api: chat path now goes through @ai-sdk/openai-compatible, and the Reasoning/ReasoningGroup slots in Thread.tsx are rendered unconditionally. See ./docs/releases/v0.122.2-changelog.md for more information.
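
The core of the change is keeping reasoning tokens separate from answer tokens as they stream in, so the Reasoning slot can render independently of the message body. A minimal sketch of that separation is below; the `StreamPart` shapes and the `splitParts` helper are illustrative assumptions modeled on the AI SDK's streamed parts, not the actual Thread.tsx implementation.

```typescript
// Assumed shapes for streamed parts from an OpenAI-compatible model (e.g. vLLM):
// reasoning deltas carry the model's "thinking" tokens, text deltas the answer.
type StreamPart =
  | { type: 'reasoning-delta'; text: string }
  | { type: 'text-delta'; text: string };

// Accumulate reasoning and answer text into separate buffers so the UI can
// feed the Reasoning/ReasoningGroup slots and the message body independently.
function splitParts(parts: StreamPart[]): { reasoning: string; text: string } {
  let reasoning = '';
  let text = '';
  for (const part of parts) {
    if (part.type === 'reasoning-delta') {
      reasoning += part.text;
    } else {
      text += part.text;
    }
  }
  return { reasoning, text };
}
```

In practice the parts would come from a provider created with createOpenAICompatible from @ai-sdk/openai-compatible and streamed via the AI SDK, rather than from a pre-collected array as shown here.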