CVE-2025-66448 — Code Injection in vLLM
Severity
8.8 High (NVD)
EPSS
0.3% (top 47.48%)
CISA KEV
Not in KEV
Exploit
No known exploits
Affected products
Timeline
Published: Dec 1
Latest update: Mar 27
Description
vLLM is an inference and serving engine for large language models (LLMs). Prior to version 0.11.1, vLLM has a critical remote code execution vector in a config class named Nemotron_Nano_VL_Config. When vLLM loads a model config that contains an auto_map entry, the config class resolves that mapping with get_class_from_dynamic_module(...) and immediately instantiates the returned class. This fetches and executes Python from the remote repository referenced in the auto_map string. Crucially, this happens …
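To make the attack pattern concrete, the sketch below simulates the vulnerable flow without any network access: an attacker-controlled config names a module and class in an auto_map-style entry, and the loader imports that module and instantiates the class. The key point is that all of the module's top-level code executes at import time, before any class is even used. The helper name load_class_from_untrusted_config, the FakeConfig class, and the local-file setup are illustrative assumptions standing in for get_class_from_dynamic_module(...) and a remote Hugging Face repository; they are not vLLM's actual code.

```python
import importlib.util
import os
import tempfile

# Hypothetical "malicious" module shipped alongside a model's config.json.
# Its module-level code runs the moment the module is imported.
MALICIOUS_SOURCE = """
EXECUTED = []  # stands in for an attacker's payload (e.g. a reverse shell)
EXECUTED.append("payload ran at import time")

class FakeConfig:  # the class the auto_map entry points at
    pass
"""

def load_class_from_untrusted_config(config: dict):
    """Simplified stand-in for get_class_from_dynamic_module(...):
    resolve the module path and class name from the config's auto_map
    entry, import the module, and return the class. Importing executes
    ALL of the module's top-level code."""
    module_path, class_name = config["auto_map"]["AutoConfig"].split(":")
    spec = importlib.util.spec_from_file_location("dynamic_cfg", module_path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)  # <-- arbitrary code runs here
    return getattr(module, class_name), module

# Simulate a model repository containing a Python file referenced by auto_map.
with tempfile.TemporaryDirectory() as repo:
    mod_file = os.path.join(repo, "configuration_evil.py")
    with open(mod_file, "w") as f:
        f.write(MALICIOUS_SOURCE)

    # Attacker-controlled config: the auto_map value points at their code.
    config = {"auto_map": {"AutoConfig": f"{mod_file}:FakeConfig"}}

    cls, module = load_class_from_untrusted_config(config)
    print(module.EXECUTED[0])  # the payload already ran during import
    instance = cls()           # instantiation, as in the vulnerable path
```

The fix direction implied by the advisory is to never resolve auto_map entries from untrusted configs automatically; dynamic-module loading should be gated behind an explicit opt-in rather than performed during config parsing.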
CVSS vector
CVSS:3.1/AV:N/AC:L/PR:N/UI:R/S:U/C:H/I:H/A:H
Exploitability: 2.8 | Impact: 5.9