Commit Graph

23 Commits

SHA1 Message Date
6aadae62aa doc/README: changelog for v1.1.0 2023-04-09 11:14:47 +12:00
92987735ae go mod tidy 2023-04-09 11:14:41 +12:00
082fb7552a doc/README: mention GOMAXPROCS is passed through to llama.cpp 2023-04-09 11:14:19 +12:00
2f4558b68e api: raise default context size from 512->1024 2023-04-09 11:12:16 +12:00
fac1a5b484 webui: new style 2023-04-09 11:11:01 +12:00
527fc92240 webui: add chat-style prompt (@ref https://github.com/ggerganov/llama.cpp/issues/771#issuecomment-1499900809) 2023-04-08 19:22:42 +12:00
8e378c0734 webui: clamp input area width, default caret to end of textarea 2023-04-08 19:22:30 +12:00
575f7ac4bc webui: show current status in the frontend 2023-04-08 16:50:49 +12:00
5bbd203d31 doc/README: changelog for v1.0.0 2023-04-08 16:15:20 +12:00
defc784dd8 cflags/arm64: fix mcpu flag syntax 2023-04-08 16:07:58 +12:00
bb60bb989f doc/README: update features + api docs 2023-04-08 16:04:55 +12:00
3ff357b7d4 api: reduce log verbosity, log the time-per-token 2023-04-08 16:04:43 +12:00
07c5ca1015 webui: autoscroll new messages 2023-04-08 16:04:26 +12:00
f5ba37a10b api: simplify/combine the llama_eval branches 2023-04-08 16:04:16 +12:00
0c96f2bf6b api: support a MaxTokens parameter 2023-04-08 16:03:59 +12:00
6c6a5c602e api: llama_eval only needs to evaluate new tokens 2023-04-08 15:48:38 +12:00
2c11e32018 api: don't log resulting tokens on backend 2023-04-08 15:48:26 +12:00
2cdcf54dd8 webui: synchronize context size value for clientside warning 2023-04-08 15:48:16 +12:00
dc8db75e04 gitignore 2023-04-08 15:31:24 +12:00
a7dd9580a5 doc/README: initial commit 2023-04-08 15:30:37 +12:00
fa8db95cc6 doc/license: add MIT license 2023-04-08 15:30:32 +12:00
d044a9e424 initial commit 2023-04-08 15:30:15 +12:00
7c6a0cdaa2 llama.cpp: commit upstream files (as of rev 62cfc54f77e5190) 2023-04-08 15:30:02 +12:00