Pull requests: ggml-org/llama.vim

- Bugfix : classic vim suffix duplicate (#7 by m18coppola, merged Nov 2, 2024)
- llama.vim : disable temp sampler and set n_predict = 0 (#25 by ggerganov, merged Jan 22, 2025)
- add configuration doc (#30 by Frefreak, merged Jan 25, 2025)
- Update README.md to use main version of llama.vim (#32 by jph00, merged Jan 25, 2025)
- Add ollama backend (#59 by akashjss, closed Mar 30, 2025)
- llama.vim : speculative fim [help wanted] (#31 by VJHack, draft, closed Mar 10, 2025)
- core : improve suggestions starting with empty lines (#51 by ggerganov, merged Mar 10, 2025)
- core : decouple rendering from server requests (#52 by ggerganov, merged Mar 10, 2025)
- core : add speculative fim (#53 by ggerganov, merged Mar 10, 2025)
- readme : update llama-server command with presets (#46 by danbev, merged Feb 20, 2025)
- add Vundle instructions (#8 by pnb, merged Nov 5, 2024)
- add optional api key support (#5 by m18coppola, merged Oct 28, 2024)
- add_quotations (#9 by PhilippKaesgen, merged Nov 15, 2024)
- llama.vim : reduce max predict time to 0.5s (#12 by ggerganov, merged Nov 17, 2024)
- Accept first word with Ctrl-B (#11 by m18coppola, merged Nov 19, 2024)
- added warning when using unsupported vim version (#14 by m18coppola, merged Dec 16, 2024)
- llama.vim : better request throttling mechanism (#19 by ggerganov, merged Jan 4, 2025)
- cache: keep cached suggestions (#18 by VJHack, merged Jan 4, 2025)
- Cache completion (#15 by VJHack, merged Dec 30, 2024)
- llama.vim : cache results even if cursor moved (#22 by ggerganov, merged Jan 10, 2025)
- allow to specify partial config (#27 by Frefreak, merged Jan 24, 2025)
- Explicit llama.cpp server launch arguments (#33 by makuche, closed Jan 25, 2025)
- Use stdin when calling curl (#39 by Frefreak, merged Jan 30, 2025)