I ran Gemma 4 as a local model in Codex CLI

I wanted to know whether Gemma 4 could replace a cloud model for my day-to-day agentic coding. Not in theory, in practice. I use Codex CLI…

Medium

Hey - I use the same setup, w/ both gemma4 and gpt-oss-*; a few things I have to do for a good experience:

1) Pin to an earlier version of codex (sorry) - 0.55 is the best experience IME, but YMMV (see https://github.com/openai/codex/issues/11940, https://github.com/openai/codex/issues/8272).
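If it helps, here's roughly how I pin it - this assumes you installed Codex CLI via npm as `@openai/codex` (adjust if you used Homebrew or a release binary), and the exact patch version is a guess:

```shell
# Pin Codex CLI to the 0.55 line (package name assumed to be @openai/codex).
npm install -g @openai/codex@0.55.0

# Confirm the pinned version is the one on your PATH.
codex --version
```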

2) Use the older completions endpoint (llama.cpp's responses support is incomplete - https://github.com/ggml-org/llama.cpp/issues/19138)
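For reference, this is the shape of the provider override I use in `~/.codex/config.toml` - treat the key names and the `gemma-4` model id as assumptions, since they vary by Codex version and by whatever you loaded into llama.cpp:

```toml
# Hypothetical ~/.codex/config.toml sketch - check your Codex version's docs.
[model_providers.llamacpp]
name = "llama.cpp"
base_url = "http://localhost:8080/v1"
# "chat" routes through /v1/chat/completions instead of /v1/responses,
# sidestepping llama.cpp's incomplete responses-endpoint support.
wire_api = "chat"

[profiles.local]
model_provider = "llamacpp"
model = "gemma-4"  # placeholder - use the model name your server reports
```

Then `codex --profile local` picks it up.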
