CVE-2024-34359: Improper Neutralization of Equivalent Special Elements in WEB UI

Severity
9.6 CRITICAL (NVD)
NVD: 8.4

EPSS
62.6% (top 1.62%)

CISA KEV
Not in KEV

Exploit
No known exploits

Timeline
Published: May 14
Latest update: Mar 9

Description

llama-cpp-python provides the Python bindings for llama.cpp. `llama-cpp-python` depends on the class `Llama` in `llama.py` to load `.gguf` llama.cpp or Latency Machine Learning models. The `__init__` constructor of `Llama` takes several parameters to configure the loading and running of the model. Besides NUMA, LoRA settings, tokenizer loading, and hardware settings, `__init__` also loads the chat template from the targeted `.gguf` file's metadata and further parses it via `llama_chat_format` to construct the model's chat handler. Because that template is rendered by an unsandboxed Jinja2 environment, a malicious chat template embedded in model metadata can achieve server-side template injection and, from there, remote code execution on the host loading the model.
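A minimal sketch (not the library's actual code) of why rendering attacker-controlled metadata with a plain Jinja2 environment is dangerous, and how a sandboxed environment, the approach taken in patched versions, blocks it. The template payload below is illustrative; a real exploit would walk object internals further to reach process-spawning gadgets:

```python
from jinja2 import Environment
from jinja2.exceptions import SecurityError
from jinja2.sandbox import ImmutableSandboxedEnvironment

# A chat template as it might appear in untrusted .gguf metadata.
# It abuses dunder attribute access to reach Python object internals.
malicious_template = (
    "{{ messages.__class__.__mro__[1].__subclasses__() | length }}"
)

messages = [{"role": "user", "content": "hi"}]

# Vulnerable pattern: an unsandboxed Environment happily evaluates
# the dunder chain, so the payload runs.
unsafe = Environment().from_string(malicious_template)
print("unsandboxed result:", unsafe.render(messages=messages))

# Hardened pattern: ImmutableSandboxedEnvironment rejects access to
# unsafe attributes such as __class__ and raises SecurityError.
safe = ImmutableSandboxedEnvironment().from_string(malicious_template)
try:
    safe.render(messages=messages)
except SecurityError as exc:
    print("sandboxed render blocked:", exc)
```

The first render succeeds and leaks interpreter internals; the second raises `SecurityError` before the payload can execute, which is the behavior the fix relies on.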

CVSS vector

CVSS:3.1/AV:N/AC:L/PR:N/UI:R/S:C/C:H/I:H/A:H
Exploitability: 2.8 | Impact: 6.0
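The Exploitability 2.8 and Impact 6.0 sub-scores, and the 9.6 base score, follow from the CVSS v3.1 equations for a changed-scope vector; a minimal sketch reproducing the arithmetic:

```python
import math

# Metric weights for AV:N/AC:L/PR:N/UI:R/S:C/C:H/I:H/A:H
AV, AC, PR, UI = 0.85, 0.77, 0.85, 0.62  # PR:N weight is 0.85 either scope
C = I = A = 0.56                          # High confidentiality/integrity/availability

exploitability = 8.22 * AV * AC * PR * UI          # ~2.8

iss = 1 - (1 - C) * (1 - I) * (1 - A)              # impact sub-score base
# Scope: Changed variant of the impact equation
impact = 7.52 * (iss - 0.029) - 3.25 * (iss - 0.02) ** 15  # ~6.0

# Changed scope: base = roundup(min(1.08 * (impact + exploitability), 10))
base = math.ceil(min(1.08 * (impact + exploitability), 10) * 10) / 10

print(round(exploitability, 1), round(impact, 1), base)  # 2.8 6.0 9.6
```

The "roundup" step is CVSS's round-toward-positive-infinity to one decimal, which is why 9.59... reports as 9.6 rather than rounding down.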

Affected Packages (2)

CVEList V5: parisneo/parisneo_lollms-webui, unspecified to latest

🔴 Vulnerability Details (3)

GHSA: GHSA-325v-4jhh-rp36: parisneo/lollms-webui, in its latest version, is vulnerable to remote code execution due to an insecure dependency on llama-cpp-python version llama_c (2024-07-02)
GHSA: llama-cpp-python vulnerable to Remote Code Execution by Server-Side Template Injection in Model Metadata (2024-05-13)
OSV: llama-cpp-python vulnerable to Remote Code Execution by Server-Side Template Injection in Model Metadata (2024-05-13)

🕵️ Threat Intelligence (1)

Trend Micro: The Road to Agentic AI: Exposed Foundations (2024-12-04)

📐 Framework References (1)

CWE-1336: Improper Neutralization of Special Elements Used in a Template Engine

📄 Research Papers (2)

arXiv: Inference-Time Backdoors via Hidden Instructions in LLM Chat Templates (2026-03-09)
arXiv: SASER: Stego attacks on open-source LLMs (2025-10-12)