abetlen/llama-cpp-python
Open Pull Requests
Support LoRA hotswapping and multiple LoRAs at a time
#1817 opened 2024-10-30 10:34 by richdougherty
fix: additional_files support glob patterns
#1800 opened 2024-10-16 05:50 by xianml
Fixed typo at line 340 of README.md
#1791 opened 2024-10-09 20:14 by Victoran0
server types: Move 'model' parameter to clarify it is used
#1786 opened 2024-10-05 23:54 by domdomegg
Fix LLAVA_CPP_LIB creating empty path
#1782 opened 2024-10-03 09:18 by navratil-matej
Add Paligemma support
#1777 opened 2024-10-02 08:50 by abetlen
Update Dockerfile
#1776 opened 2024-10-01 11:30 by Smartappli
fix: handle multiple calls to the same tool
#1758 opened 2024-09-24 15:25 by jeffmaury
Corrected command
#1739 opened 2024-09-12 08:58 by Shehrozkashif
Resync llama_grammar with llama.cpp implementation and use curly braces quantities instead of repetitions
#1721 opened 2024-08-31 20:39 by gbloisi-openaire
Remove unnecessary pyproject optional dependency
#1718 opened 2024-08-30 17:13 by LecrisUT
feat: adding support for external chat format contribution
#1716 opened 2024-08-29 20:12 by axel7083
Allow server to accept openai's new structured output "json_schema" format.
#1677 opened 2024-08-13 07:53 by cerealbox
🚀 Add Ruff Linter
#1651 opened 2024-08-02 15:16 by Smartappli
Support images from local storage for Llava models
#1583 opened 2024-07-09 09:28 by GokulMuraliRajasekar
Add stream_options support according to OpenAI API
#1552 opened 2024-06-25 06:42 by tpfau
Change server approach to handle parallel requests
#1550 opened 2024-06-24 08:00 by sergey-zinchenko
Workflow update - PART 2
#1515 opened 2024-06-06 22:21 by Smartappli
Integrate Functionary v2.5 + Refactor Functionary Code
#1509 opened 2024-06-05 15:36 by jeffrey-fong
Support parallel function calls with tool_choice
#1503 opened 2024-06-02 20:55 by CISC
Chat template rendering extensions to match transformers
#1486 opened 2024-05-26 06:56 by CISC
Support multiple chat templates - step 2
#1440 opened 2024-05-09 23:08 by CISC
LLaMA cpp python server: IPV6 support
#1427 opened 2024-05-03 20:18 by Smartappli
fix: add binding for name in ChatCompletionRequestToolMessage
#1407 opened 2024-04-29 20:34 by JDScript
Add the Command R chat format
#1382 opened 2024-04-25 10:22 by euxoa
Improve function calling (auto selection, parallel functions)
#1351 opened 2024-04-17 13:35 by themrzmaster
Feature: Lightweight llama_cpp.server Docker Image Build Workflow
#1331 opened 2024-04-05 16:56 by devcxl
Add user-assistant chat format
#1281 opened 2024-03-17 21:16 by DrewWalkup
Exposes json_schema_to_gbnf method for importing from module
#1212 opened 2024-02-23 12:23 by lukestanley
WIP: Parallel generation implementation
#1209 opened 2024-02-22 05:15 by iamlemec