
Misc. bug: Librechat + Filesystem MCP + llama-server Fails Due to Invalid Schema #17574

@chadvoegele

Description


Name and Version

```
chad@chaddev:~/code/github.com/chadvoegele/llama.cpp>> ./build/bin/llama-server --version
ggml_cuda_init: found 1 CUDA devices:
  Device 0: NVIDIA GeForce RTX 5090, compute capability 12.0, VMM: yes
version: 7188 (7c6980ae)
built with cc (GCC) 15.2.1 20251112 for x86_64-pc-linux-gnu
```

Operating systems

Linux

Which llama.cpp modules do you know to be affected?

llama-server

Command line

```shell
@chaddev:~/code/github.com/chadvoegele/llama.cpp>> ./build/bin/llama-server -v -hf ggml-org/gpt-oss-120b-GGUF --host 0.0.0.0 --port 8080 --ctx-size 0 --jinja -ub 2048 -b 2048 -ncmoe 22 --jinja --api-key-file <(pass llama-server-api-key)
```

Problem description & steps to reproduce

I'm running LibreChat + llama-server with gpt-oss-120b for a fully local agent. Everything works great, but as soon as I add the Filesystem MCP tool in LibreChat, requests start failing.

I looked into the logs and found that it's because LibreChat + Filesystem MCP sends a `not: {}` in the tool schema, which llama-server's JSON-schema-to-grammar conversion does not support. Worse, since llama-server returns a 500, LibreChat just retries the request and then fails with an unknown error.
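For reference, the offending construct, distilled from the request log below, is an `anyOf` branch containing `not: {}` (a minimal illustration, not the exact schema LibreChat sends):

```json
{
  "type": "object",
  "properties": {
    "tail": {
      "anyOf": [
        { "not": {} },
        { "type": "number" }
      ]
    }
  }
}
```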

Since `not` support in the schema conversion is unlikely to be implemented, I'm proposing to fix this by:

  1. Changing llama-server's response for an invalid schema from a 500 to a 400, in PR 17572
  2. Removing the superfluous `not: {}` from LibreChat + Filesystem MCP
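On why fix 2 is safe: `{"not": {}}` matches no value at all (the empty schema `{}` matches everything, and `not` inverts it), so dropping that branch from an `anyOf` never changes which instances the schema accepts. A hypothetical client-side sketch of that pruning (the function name and placement are my own for illustration, not LibreChat code):

```python
def prune_not_empty(schema):
    """Recursively drop {"not": {}} branches from anyOf lists.

    Since {"not": {}} accepts no instance, removing it from an anyOf
    leaves the set of accepted instances unchanged. Note: if an anyOf
    consisted solely of {"not": {}} branches, this would leave an empty
    anyOf behind; real tool schemas always have at least one live branch.
    """
    if isinstance(schema, dict):
        out = {}
        for key, value in schema.items():
            if key == "anyOf" and isinstance(value, list):
                out[key] = [prune_not_empty(b) for b in value if b != {"not": {}}]
            else:
                out[key] = prune_not_empty(value)
        return out
    if isinstance(schema, list):
        return [prune_not_empty(v) for v in schema]
    return schema
```

Running this over the `tail`/`head` parameters from the log below would leave only the `number`/`null` branches, which llama-server converts without complaint.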

Although not a fix, this hack in common/json-schema-to-grammar.cpp makes everything hum:

```diff
+        } else if (schema.dump() == "{\"not\":{}}") {
+            return "";
```

First Bad Commit

No response

Relevant log output

```
srv  log_server_r: request:  {"model":"gpt-oss","user":"689567a9ec7a4187cb32d064","stream":true,"tools":[{"type":"function","function":{"name":"read_file_mcp_filesystem","description":"Read the complete contents of a file as text. DEPRECATED: Use read_text_file instead.","parameters":{"type":"object","properties":{"path":{"type":"string"},"tail":{"anyOf":[{"anyOf":[{"not":{}},{"type":"number","description":"If provided, returns only the last N lines of the file"}],"description":"If provided, returns only the last N lines of the file"},{"type":"null"}],"description":"If provided, returns only the last N lines of the file"},"head":{"anyOf":[{"anyOf":[{"not":{}},{"type":"number","description":"If provided, returns only the first N lines of the file"}],"d ... ],"messages":[{"role":"system","content":"[object Promise]"},{"role":"user","content":"hi show me the files"}]}

srv  log_server_r: response: {"error":{"code":500,"message":"JSON schema conversion failed:\nUnrecognized schema: {\"not\":{}}
```

Metadata

Assignees

No one assigned

Labels

- bug: Something isn't working
- medium severity: Used to report medium severity bugs in llama.cpp (e.g. malfunctioning features but still usable)
- server/api
