OpenClaw v2026.4.14 Released — GPT-5.4 support, Telegram topics, and core reliability fixes
OpenClaw v2026.4.14 adds forward-compat GPT-5.4 support, richer Telegram topic context, Codex catalog fixes, and broader reliability work across models, tools, and local runs.
OpenClaw v2026.4.14 was published on 2026-04-14T13:03:29Z by vincentkoc.
Quick answer
This release matters most for teams using GPT-5 family models, Telegram forum workflows, Codex catalog entries, and slower local Ollama runs that previously timed out too aggressively.
What's new in this release
- OpenAI Codex/models: add forward-compat support for `gpt-5.4-pro`, including Codex pricing/limits and list/status visibility before the upstream catalog catches up. (#66453) Thanks @jepson-liu.
- Telegram/forum topics: surface human topic names in agent context, prompt metadata, and plugin hook metadata by learning names from Telegram forum service messages. (#65973) Thanks @ptahdunbar.
- Agents/Ollama: forward the configured embedded-run timeout into the global undici stream timeout tuning, so slow local Ollama runs honor the operator-set run timeout instead of inheriting the default stream cutoff. (#63175) Thanks @mindcraftreader and @vincentkoc.
- Models/Codex: include `apiKey` in the codex provider catalog output so the Pi ModelRegistry validator no longer rejects the entry and silently drops all custom models from every provider in `models.json`. (#66180) Thanks @hoyyeva.
- Tools/image+pdf: normalize configured provider/model refs before the media-tool registry lookup, so image and PDF tool runs no longer reject valid Ollama vision models as unknown when the tool path skipped the usual model-ref normalization step. (#59943) Thanks @yqli2420 and @vincentkoc.
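To make the Ollama timeout fix concrete, here is a minimal sketch of the idea: prefer an operator-configured run timeout when deriving undici-style stream timeout options, falling back to a default cutoff only when none is set. The function name, option shape, and default value are illustrative assumptions, not OpenClaw's actual API.

```typescript
// Hypothetical sketch: map an operator-set run timeout onto undici-style
// stream timeout options so slow local runs are not cut off early.
type StreamTimeouts = { headersTimeout: number; bodyTimeout: number };

const DEFAULT_STREAM_TIMEOUT_MS = 300_000; // assumed default cutoff (5 min)

function streamTimeoutsForRun(runTimeoutMs?: number): StreamTimeouts {
  // Honor the configured run timeout when present; otherwise keep the default.
  const effective =
    runTimeoutMs && runTimeoutMs > 0 ? runTimeoutMs : DEFAULT_STREAM_TIMEOUT_MS;
  return { headersTimeout: effective, bodyTimeout: effective };
}
```

With a 15-minute run timeout configured, `streamTimeoutsForRun(900_000)` yields stream timeouts of 900 000 ms rather than the default cutoff; before the fix, the default cutoff applied regardless.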
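The media-tool fix is about normalizing a configured model ref before the registry lookup, the same way chat paths do. A rough sketch of such a normalizer is below; the function name, default provider, and lower-casing rule are assumptions for illustration, not OpenClaw's implementation.

```typescript
// Hypothetical sketch: canonicalize a "provider/model" ref before a
// registry lookup so differently-cased configs resolve the same entry.
function normalizeModelRef(ref: string): { provider: string; model: string } {
  const trimmed = ref.trim();
  const slash = trimmed.indexOf("/");
  if (slash === -1) {
    // Bare model name: assume a default provider (an assumption here).
    return { provider: "ollama", model: trimmed.toLowerCase() };
  }
  return {
    provider: trimmed.slice(0, slash).toLowerCase(),
    model: trimmed.slice(slash + 1).toLowerCase(),
  };
}
```

Under this scheme, `"Ollama/LLaVA:13B"` and `"ollama/llava:13b"` resolve to the same registry key, which is the class of mismatch the release fixes for image and PDF tool runs.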
Why developers should care
If you run GPT-5 family models through Codex, rely on Telegram forum topics for agent context, host slow local models on Ollama, or use the image/PDF media tools, these fixes directly affect your setup: custom models are no longer silently dropped from `models.json`, topic names now reach prompts and plugin hooks, and operator-set run timeouts are actually honored.
Upgrade notes
The GitHub release notes flag breaking-change risk. Review the full release notes carefully before upgrading production agents, channels, or plugin-heavy environments.
Source
Full release notes: https://github.com/openclaw/openclaw/releases/tag/v2026.4.14
Frequently asked questions
What changed in OpenClaw v2026.4.14?
The headline changes are forward-compat support for `gpt-5.4-pro` in the Codex catalog, including pricing/limits and list/status visibility (#66453); human-readable Telegram forum topic names surfaced in agent context, prompt metadata, and plugin hook metadata (#65973); and operator-set run timeouts forwarded into undici stream timeout tuning so slow local Ollama runs are no longer cut off early (#63175). The release also fixes the codex provider catalog entry so custom models are no longer silently dropped (#66180) and normalizes model refs for image/PDF tool lookups (#59943).
Does OpenClaw v2026.4.14 include breaking changes?
The release notes flag general breaking-change risk but do not call out specific breaking changes. Review the full release notes before upgrading provider-heavy, channel-heavy, or plugin-heavy environments: https://github.com/openclaw/openclaw/releases/tag/v2026.4.14
Who should pay attention to OpenClaw v2026.4.14?
Teams using OpenAI Codex models, Telegram forum topics, Ollama, or media-tool workflows should review this release closely.