Blog

It's minimal, but I'm posting things.

LLM auto-complete plugin for Vim with OpenAI, Mistral, and other third-party vendors

I needed LLM completions in Vim9 without running local models.
llama.vim is an excellent Vim plugin for LLM autocompletion.
But it expects a local model to be running.
My computers are not powerful enough to run local models, and I enjoy the models provided by third-party vendors.
So I implemented vim-llama-adapter: a small Python server that makes llama.vim think it's talking to a local model; in reality, the server forwards requests to remote LLM providers.
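
To make the idea concrete, here is a minimal sketch of what such an adapter can look like (not the actual vim-llama-adapter code): a tiny Flask server that exposes a llama-server-style /infill endpoint and forwards the text around the cursor to Mistral's FIM API. The field names (input_prefix, input_suffix, content), the response shape, and the port are assumptions based on how llama.vim talks to a local llama-server; the real project may differ.

    # adapter_sketch.py -- illustration only, not the actual vim-llama-adapter code.
    # Assumes llama.vim posts to /infill with "input_prefix"/"input_suffix" fields,
    # reads "content" from the response, and that Mistral's FIM endpoint accepts
    # {"model", "prompt", "suffix", "max_tokens"}.
    import os

    import requests
    from flask import Flask, request, jsonify

    app = Flask(__name__)
    MISTRAL_API_KEY = os.environ["MISTRAL_API_KEY"]
    MISTRAL_FIM_URL = "https://api.mistral.ai/v1/fim/completions"

    @app.post("/infill")
    def infill():
        body = request.get_json(force=True)
        # llama.vim sends the text before and after the cursor; forward it as a FIM request.
        prefix = body.get("input_prefix", "")
        suffix = body.get("input_suffix", "")
        resp = requests.post(
            MISTRAL_FIM_URL,
            headers={"Authorization": f"Bearer {MISTRAL_API_KEY}"},
            json={
                "model": "codestral-latest",  # assumption: any Codestral FIM model works here
                "prompt": prefix,
                "suffix": suffix,
                "max_tokens": 64,
            },
            timeout=30,
        )
        resp.raise_for_status()
        data = resp.json()
        # Assumes the FIM response mirrors chat completions: choices[0].message.content.
        completion = data["choices"][0]["message"]["content"]
        # Return the shape llama.vim expects from a local llama-server /infill endpoint.
        return jsonify({"content": completion})

    if __name__ == "__main__":
        app.run(host="127.0.0.1", port=8012)  # 8012 mirrors llama.vim's default endpoint

With something like this listening on 127.0.0.1:8012, llama.vim can be pointed at it as if it were a local llama-server instance.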

For instructions on running the adapter, configuring your vimrc, and getting autocompletions, check the README.md.
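
For illustration, the vimrc side could look roughly like the snippet below. This assumes llama.vim exposes an 'endpoint' key in g:llama_config and that the adapter listens locally on port 8012; the README.md is the authoritative reference.

    " Hedged example only -- check the project's README.md for the real settings.
    let g:llama_config = {
          \ 'endpoint': 'http://127.0.0.1:8012/infill',
          \ }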

Good to know

  • I currently use Mistral’s FIM model, Codestral. You can get a free API key from the Mistral console.
  • Extending the adapter to other providers such as OpenAI is welcome, though this sort of thing works best with FIM (fill-in-the-middle) models rather than plain completion models. Some providers don’t support FIM natively; the adapter can emulate FIM via prompt engineering (a rough sketch of the idea follows below), but results vary. As of the first release, it works okay with Mistral and poorly with OpenAI; nothing we can’t fix, though.
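
To sketch what emulating FIM via prompt engineering can look like against a plain chat-completions provider, here is a rough illustration. The prompt wording, the model name, and the request/response handling are assumptions made for the example, not the adapter’s actual behaviour.

    # fim_emulation_sketch.py -- illustration only, not the adapter's actual code.
    # Emulates fill-in-the-middle with a chat-completions API by wrapping the prefix
    # and suffix in markers and asking the model to produce only the middle part.
    # The prompt wording and model name ("gpt-4o-mini") are assumptions for the example.
    import os

    import requests

    def emulate_fim(prefix: str, suffix: str, max_tokens: int = 64) -> str:
        system = (
            "You are a code completion engine. Output only the code that belongs "
            "between <PREFIX> and <SUFFIX>, with no explanations and no code fences."
        )
        user = f"<PREFIX>\n{prefix}\n</PREFIX>\n<SUFFIX>\n{suffix}\n</SUFFIX>"
        resp = requests.post(
            "https://api.openai.com/v1/chat/completions",
            headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
            json={
                "model": "gpt-4o-mini",
                "messages": [
                    {"role": "system", "content": system},
                    {"role": "user", "content": user},
                ],
                "max_tokens": max_tokens,
            },
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]

Chat models often ignore parts of this contract (wrapping output in fences, adding commentary), which is likely part of why the emulated path behaves worse than native FIM.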

Cheers,


Published on 2025-12-03T22:21:34.938015Z
Last updated on 2025-12-03T22:21:34.938134Z