Backstage release v0.122.2
Changed
- Stream reasoning tokens from OpenAI-compatible models (vLLM) to the AI chat UI by switching the `aiChat.openai.api: chat` path to `@ai-sdk/openai-compatible` and unconditionally rendering the `Reasoning`/`ReasoningGroup` slots in `Thread.tsx`. See ./docs/releases/v0.122.2-changelog.md for more information.
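For context, `aiChat.openai.api: chat` is a configuration key. A minimal sketch of how such a config fragment might look is shown below; everything other than the `aiChat.openai.api` key itself (the `baseUrl` key, the endpoint, and the model name) is a hypothetical illustration, not taken from this release:

```yaml
# Hypothetical app-config sketch (keys other than aiChat.openai.api are assumptions)
aiChat:
  openai:
    api: chat                          # chat-completions path, now backed by @ai-sdk/openai-compatible
    baseUrl: http://localhost:8000/v1  # e.g. a local vLLM OpenAI-compatible server (assumption)
    model: my-reasoning-model          # placeholder model name (assumption)
```

With a setup along these lines, reasoning tokens emitted by the vLLM backend would stream through to the chat UI, where the `Reasoning`/`ReasoningGroup` slots render them.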