I first experienced AI-assisted development via GitHub Copilot during its beta in 2021. It was the first AI programming assistant at the time, and it offered only tab completion, which I believe was powered by OpenAI's Codex model.

GitHub Copilot lost its edge quickly; many new competitors appeared and surpassed it. The second coding tool I tried was Cursor, also in its early beta days. It felt like a huge leap forward, though everything remained immature and the generated code rarely even compiled.

Fast forward to 2026, and the number of projects has exploded. We now have Claude Code, Cline, Kilo Code, Hermes Agent, Zed, Codex, Aider, Windsurf, Gemini CLI, and many others. I regularly test new projects when they offer genuinely interesting features, though such unique features are becoming increasingly rare. After years of experimentation, I've settled on a single tool for AI-assisted programming: OpenCode.

I am genuinely enthusiastic about OpenCode. The UI is great, it includes all the features I actually need, and it's open source. The agent framework feels good, even if I disagree with the assumption that a single harness works equally well for every LLM without per-model adjustments. I love the flexibility to switch between models and providers, plus the ability to leverage existing subscriptions.

Most importantly, I trust that the team behind the project won't sell out and abandon open source. The project originated with three developers in the opencode-ai/opencode repository. When one founder wanted to sell and the other two refused, those two created anomalyco/opencode (initially sst/opencode). This fork has since become the clear winner of the two competing projects. After a period of shared naming (which led to a lot of confusion), the original project rebranded as Crush. While Crush maintains respectable traction with 22k GitHub stars, it is overshadowed by the "real" OpenCode with 132k stars.

The open source nature also led to the creation of opencode.nvim, a Neovim extension that filled the gap relative to tools like Cursor: it lets me select code, ask specific questions about it, and have the selection added to the context automatically. Since adopting the plugin, I haven't missed other IDEs.

… besides one gap that still remains in Neovim: good tab completion. Cursor excels here; its completion model is remarkably good, and I admire the technical approach of continuous fine-tuning with updates every 30 minutes (reading recommendation: tab-rl).

Neovim introduced a new API for inline completion in (I think) v0.11, but tool adoption remains limited. I’m only aware of two maintained implementations:

  • GitHub Copilot Language Server
  • llama.vim, which builds on llama.cpp (the foundation of most local LLM projects)

llama.vim works well in principle, but without a good GPU it's painfully slow. The available models are also constrained, both in quantity and capability.
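For context, llama.vim talks to a locally running llama.cpp server that serves fill-in-the-middle (FIM) completions. A minimal launch sketch might look like the following; the model file is a placeholder (any FIM-capable GGUF model works), and the port is an assumption chosen to match the plugin's documented default endpoint:

```shell
# Illustrative sketch: serve a local FIM-capable model with llama.cpp.
# "qwen2.5-coder-1.5b-q8_0.gguf" is a placeholder path, not a required model.
llama-server \
  -m qwen2.5-coder-1.5b-q8_0.gguf \
  --port 8012 \
  -ngl 99   # offload layers to the GPU; without one, expect the slowness described above
```

The `-ngl 99` flag is exactly where the GPU dependency bites: with no GPU to offload to, every completion request runs on the CPU, which is what makes the experience painfully slow in practice.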

Supermaven once filled this niche for Neovim users, but Cursor acquired it two years ago (which is why Cursor now leads in tab completion). Even though their website confusingly still suggests otherwise, new users can no longer sign up for it.

Supermaven was temporarily slated to be sunset even for existing users, which frustrated a lot of people. OpenCode's creators hinted that they plan to build their own alternative (which I would have loved), but since Supermaven reverted the decision and now provides free inference for all existing users, I don't think it's a priority for them, unfortunately.

In my opinion, the industry rushed too quickly from tab completion to async coding agents. For me, the next major productivity gains won't come from better models but from a plugin that nails tab completion. It would strike the ideal balance: I code dramatically faster without offloading my understanding of the problem. Cognitive debt is a big concern of mine with async coding agents, which is why I use them only very selectively. I hope this gap closes soon.