The curl URL parser wrongly accepts percent-encoded URL separators, such as %2F (the encoding of /), when decoding the host name part of a URL. The result is a different URL that uses the wrong host name when it is later retrieved. For example, a URL like http://example.com%2F127.0.0.1/ would be accepted by the parser and transposed into http://example.com/127.0.0.1/. This flaw can be used to circumvent filters, checks and more.
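The hazard, and the check that the fixed versions apply, can be sketched in Python (the helper names and the exact set of rejected separators are illustrative assumptions, not curl's implementation):

```python
from urllib.parse import urlsplit, unquote

# Separators that must never appear in a percent-decoded host name.
FORBIDDEN_IN_HOST = set("/\\?#")

def decoded_host(url: str) -> str:
    """Show the hazard: extract the host, then percent-decode it.

    A parser that decodes the host after splitting turns
    "example.com%2F127.0.0.1" into "example.com/127.0.0.1", so a
    filter that checked the original host no longer matches what is
    effectively requested.
    """
    return unquote(urlsplit(url).netloc)

def validated_host(url: str) -> str:
    """Mirror the fixed behavior: reject hosts whose decoded form
    would contain a URL separator, instead of silently accepting them."""
    host = urlsplit(url).netloc
    if FORBIDDEN_IN_HOST & set(unquote(host)):
        raise ValueError(f"forbidden separator in host of {url!r}")
    return host
```

With the vulnerable-style decoding, http://example.com%2F127.0.0.1/ yields the host "example.com/127.0.0.1"; the validating variant raises instead of producing a host containing a separator.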
Weakness
The product does not properly handle when all or part of an input has been URL encoded.
Affected Software
| Name | Vendor | Start Version | End Version |
|------|--------|---------------|-------------|
| Curl | Haxx | 7.80.0 (including) | 7.83.1 (excluding) |
| Curl | Ubuntu | devel | * |
| Curl | Ubuntu | jammy | * |
| Curl | Ubuntu | upstream | * |
Potential Mitigations
- Assume all input is malicious. Use an “accept known good” input validation strategy, i.e., use a list of acceptable inputs that strictly conform to specifications. Reject any input that does not strictly conform to specifications, or transform it into something that does.
- When performing input validation, consider all potentially relevant properties, including length, type of input, the full range of acceptable values, missing or extra inputs, syntax, consistency across related fields, and conformance to business rules. As an example of business rule logic, “boat” may be syntactically valid because it only contains alphanumeric characters, but it is not valid if the input is only expected to contain colors such as “red” or “blue.”
- Do not rely exclusively on looking for malicious or malformed inputs. This is likely to miss at least one undesirable input, especially if the code’s environment changes. This can give attackers enough room to bypass the intended validation. However, denylists can be useful for detecting potential attacks or determining which inputs are so malformed that they should be rejected outright.
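An "accept known good" strategy for host names might look like the following sketch. The pattern is a deliberately strict allowlist (letters, digits, hyphens and dots), not a full RFC 1035 validator; anything outside it, including percent signs and slashes, is rejected outright rather than sanitized:

```python
import re

# Allowlist: dot-separated labels of letters, digits and inner hyphens.
# Strictly conforming input passes; everything else is rejected.
HOST_RE = re.compile(
    r"^[A-Za-z0-9](?:[A-Za-z0-9-]*[A-Za-z0-9])?"
    r"(?:\.[A-Za-z0-9](?:[A-Za-z0-9-]*[A-Za-z0-9])?)*$"
)

def is_acceptable_host(host: str) -> bool:
    """Return True only for hosts matching the allowlist pattern."""
    return HOST_RE.fullmatch(host) is not None
```

Under this check, example.com is accepted while example.com%2F127.0.0.1 and example.com/127.0.0.1 are both rejected, because % and / are simply not in the acceptable alphabet.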
References