CVE-2025-48942 — Uncaught Exception in vLLM
Severity
6.5 Medium (NVD)
EPSS
0.2% (top 56.55%)
CISA KEV
Not in KEV
Exploit
No known exploits
Affected products
Timeline
Published: May 30, 2025
Description
vLLM is an inference and serving engine for large language models (LLMs). In versions 0.8.0 up to but excluding 0.9.0, hitting the /v1/completions API with an invalid json_schema as a guided parameter kills the vLLM server. This vulnerability is similar to GHSA-9hcf-v7m4-6m2j/CVE-2025-48943, but for a JSON schema instead of a regex. Version 0.9.0 fixes the issue.
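The underlying failure mode is that a malformed guided-decoding parameter raises an exception that is never caught, taking down the whole server process. The general mitigation is to validate the user-supplied parameter up front and convert failures into a 400-style error response. A minimal sketch of that defensive pattern (the helper name and return shape are hypothetical, not vLLM's actual code):

```python
import json


def validate_guided_json(raw: str):
    """Hypothetical helper: return (schema, None) on success,
    (None, error_message) on failure.

    Catching malformed guided-decoding input here lets the server
    answer HTTP 400 instead of dying on an uncaught exception.
    """
    try:
        schema = json.loads(raw)
    except json.JSONDecodeError as exc:
        return None, f"invalid json_schema: {exc}"
    if not isinstance(schema, dict):
        return None, "json_schema must be a JSON object"
    return schema, None


# An invalid guided param yields an error value, not a crash:
schema, err = validate_guided_json("{not valid json")
print(err is not None)  # True
```

The same pattern applies to the sibling regex variant (CVE-2025-48943): compile or parse the untrusted pattern inside a try/except and reject it cleanly on failure.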
CVSS vector
CVSS:3.1/AV:N/AC:L/PR:L/UI:N/S:U/C:N/I:N/A:H
Exploitability: 2.8 | Impact: 3.6
Affected Packages: 3 packages
Patches