f5ba37a10b  api: simplify/combine the llama_eval branches  2023-04-08 16:04:16 +12:00
0c96f2bf6b  api: support a MaxTokens parameter  2023-04-08 16:03:59 +12:00
6c6a5c602e  api: llama_eval only needs to evaluate new tokens  2023-04-08 15:48:38 +12:00
2c11e32018  api: don't log resulting tokens on backend  2023-04-08 15:48:26 +12:00
2cdcf54dd8  webui: synchronize context size value for clientside warning  2023-04-08 15:48:16 +12:00
d044a9e424  initial commit  2023-04-08 15:30:15 +12:00
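Commits 6c6a5c602e and f5ba37a10b both revolve around incremental evaluation: instead of re-running llama_eval over the entire prompt on every request, only the tokens appended since the last call are evaluated, with n_past telling llama.cpp how much of the context's KV cache is already populated. The repository's own wrapper code and language are not shown here, so the following is only an illustrative C++ sketch against the llama.cpp C API as it stood around April 2023; the function name eval_new_tokens and its parameters are hypothetical, not taken from this project.

// Minimal sketch (assumed code, not this repository's) of evaluating only
// newly appended tokens with the April-2023 llama.cpp API.
#include "llama.h"

#include <vector>

// Evaluate only the tokens appended since the last call.
// `n_past` is how many tokens the context has already seen.
// Returns the updated n_past, or -1 on failure.
static int eval_new_tokens(llama_context * ctx,
                           const std::vector<llama_token> & all_tokens,
                           int n_past,
                           int n_threads) {
    const int n_new = (int) all_tokens.size() - n_past;
    if (n_new <= 0) {
        return n_past; // nothing new to evaluate
    }
    // Pass only the unevaluated suffix; the KV cache already covers the rest.
    if (llama_eval(ctx, all_tokens.data() + n_past, n_new, n_past, n_threads) != 0) {
        return -1;
    }
    return n_past + n_new;
}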
|