CVE-2025-25183 — Improper Validation of Integrity Check Value in vLLM
Severity
2.6 LOW (NVD)
EPSS
0.3% (top 44.57%)
CISA KEV
Not in KEV
Exploit
No known exploits
Affected products
Timeline
Published: Feb 7
Latest update: Feb 11
Description
vLLM is a high-throughput and memory-efficient inference and serving engine for LLMs. Maliciously constructed statements can lead to hash collisions, resulting in cache reuse, which can interfere with subsequent responses and cause unintended behavior. Prefix caching makes use of Python's built-in hash() function. As of Python 3.12, the behavior of hash(None) has changed to be a predictable constant value. This makes it more feasible that someone could try to exploit hash collisions. The impact of …
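The underlying weakness is easy to demonstrate outside vLLM. A minimal sketch (not vLLM code; the helper function name is hypothetical) comparing hash(None), which is a fixed constant on Python 3.12+, with a salted string hash, which varies per interpreter process:

```python
import os
import subprocess
import sys

def hash_in_fresh_interpreter(expr: str) -> int:
    """Evaluate hash(expr) in a brand-new Python process."""
    # Drop PYTHONHASHSEED so string-hash salting stays enabled.
    env = {k: v for k, v in os.environ.items() if k != "PYTHONHASHSEED"}
    out = subprocess.run(
        [sys.executable, "-c", f"print(hash({expr}))"],
        capture_output=True, text=True, check=True, env=env,
    )
    return int(out.stdout)

# On Python >= 3.12, hash(None) is a fixed constant, identical in every
# process -- an attacker can compute it offline and craft collisions.
none_hashes = {hash_in_fresh_interpreter("None") for _ in range(3)}

# String hashes remain salted per process, so they differ between runs.
str_hashes = {hash_in_fresh_interpreter("'prefix'") for _ in range(3)}

print("distinct hash(None) values:", len(none_hashes))
print("distinct hash('prefix') values:", len(str_hashes))
```

On Python 3.12 and later the first set collapses to a single value, which is what makes collisions against a hash()-keyed prefix cache predictable; the vLLM fix moved the cache key away from the built-in hash() for this reason.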
CVSS vector
CVSS:3.1/AV:N/AC:H/PR:L/UI:R/S:U/C:N/I:L/A:N
Exploitability: 1.2 | Impact: 1.4
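The subscores and the 2.6 base score follow from the CVSS v3.1 base formula; a quick check of the arithmetic, with metric weights taken from the v3.1 specification:

```python
# CVSS v3.1 base-score arithmetic for AV:N/AC:H/PR:L/UI:R/S:U/C:N/I:L/A:N
AV, AC, PR, UI = 0.85, 0.44, 0.62, 0.62  # Network / High / Low (scope unchanged) / Required
C, I, A = 0.0, 0.22, 0.0                 # None / Low / None

exploitability = 8.22 * AV * AC * PR * UI
iss = 1 - (1 - C) * (1 - I) * (1 - A)    # impact sub-score
impact = 6.42 * iss                      # scope-unchanged impact

def roundup(x: float) -> float:
    """'Round up to one decimal' exactly as defined in the v3.1 spec."""
    i = round(x * 100000)
    return i / 100000.0 if i % 10000 == 0 else (i // 10000 + 1) / 10.0

base = 0.0 if impact <= 0 else roundup(min(impact + exploitability, 10))
print(round(exploitability, 1), round(impact, 1), base)  # 1.2 1.4 2.6
```

The sum 1.18 + 1.41 rounds up to 2.6, matching the NVD severity above.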
Affected Packages: 4 packages
🔴 Vulnerability Details (3 advisories)
OSV (2025-02-07): CVE-2025-25183: vLLM is a high-throughput and memory-efficient inference and serving engine for LLMs
GHSA (2025-02-06): vLLM uses Python 3.12 built-in hash() which leads to predictable hash collisions in prefix cache
OSV (2025-02-06): vLLM uses Python 3.12 built-in hash() which leads to predictable hash collisions in prefix cache