neovim config - native lsp support instead of coc
Migrating language server, linting and formatting away from coc
Update: Local AI assistant in vim with codellama with iGPU accelerated inference - container image details
Update: Local AI assistant in vim with codellama - performance evaluation
Update: Local AI assistant in vim with codellama - now with iGPU accelerated inference
Migrating a vimscript configuration to lua
Local AI assistant in vim with codellama
Update: Local AI assistant - llama 3
Update: Local AI assistant - codellama vs mistral
I bought a Drevo Calibur, a nice 71-key (60%) Bluetooth-capable keyboard with mechanical switches.
The Problem
DISCLAIMER: This article is pretty old and might be obsolete. If it is still of any use to somebody, good. But most likely, it isn’t.
Two-Factor Authentication using your machine’s tpm module
‘Secure’ automatic system decryption with tpm and LUKS
atuin server on a Raspberry Pi - Cross compiling Rust