CVE-2024-32878

Published Apr 26, 2024

Last updated 7 months ago

Overview

Description
llama.cpp provides LLM inference in C/C++. A use-of-uninitialized-heap-variable vulnerability exists in gguf_init_from_file: the code later frees this uninitialized variable. A simple PoC directly causes a crash; with a carefully constructed file, it may be possible to control the uninitialized value and trigger an arbitrary-address free, which could be further exploited. The flaw causes llama.cpp to crash (DoS) and may even lead to arbitrary code execution (RCE). This vulnerability has been patched in commit b2740.
Source
security-advisories@github.com
NVD status
Awaiting Analysis

Risk scores

CVSS 3.1

Type
Secondary
Base score
7.1
Impact score
5.5
Exploitability score
1.6
Vector string
CVSS:3.1/AV:N/AC:H/PR:N/UI:R/S:U/C:H/I:H/A:L
Severity
HIGH

Weaknesses

security-advisories@github.com
CWE-456
