Deserialization of Untrusted Data, Inclusion of Functionality from Untrusted Control Sphere vulnerability in Apache Software Foundation Apache Airflow Spark Provider.
When the Apache Spark provider is installed on an Airflow deployment, an Airflow user who is authorized to configure Spark hooks can effectively run arbitrary code on the Airflow node by pointing it at a malicious Spark server. Prior to version 4.1.3, this was not called out explicitly in the documentation, so administrators may have granted the authorization to configure Spark hooks without taking this into account. We recommend that administrators review their configurations to make sure the authorization to configure Spark hooks is granted only to fully trusted users.
To view the warning in the docs, please visit https://airflow.apache.org/docs/apache-airflow-providers-apache-spark/4.1.3/connections/spark.html
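As an illustration of the attack surface (not the specific exploit covered by this advisory), the sketch below shows how a Spark connection can be defined in Airflow; the connection id and host are hypothetical, and an attacker-controlled master URL like this is what an authorized user could point a hook at.

```python
# Illustrative sketch only: an Airflow Spark connection whose master URL
# points at a host the deployment does not control. The conn_id and host
# below are hypothetical values chosen for this example.
from airflow.models import Connection

untrusted_conn = Connection(
    conn_id="spark_default",
    conn_type="spark",
    host="spark://untrusted-spark-master.example.com:7077",  # attacker-controlled master
)
# A Spark hook or operator resolving this connection would submit work to,
# and accept responses from, the untrusted server.
```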
The product deserializes untrusted data without sufficiently verifying that the resulting data will be valid.
| Name | Vendor | Start Version | End Version |
|---|---|---|---|
| Airflow_spark_provider | Apache | * | 4.1.3 (excluding) |
It is often convenient to serialize objects for communication or to save them for later use. However, deserialized data or code can often be modified without using the provided accessor functions if it does not use cryptography to protect itself. Furthermore, any cryptography would still be client-side security, which is a dangerous security assumption. Untrusted data cannot be trusted to be well-formed. When developers place no restrictions on "gadget chains", i.e., series of instances and method invocations that can self-execute during the deserialization process (before the object is returned to the caller), it is sometimes possible for attackers to leverage them to perform unauthorized actions, such as generating a shell.
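As a generic illustration of this weakness class (not the specific mechanism used in the Spark provider), the Python sketch below shows how code can execute during deserialization before any object is ever returned to the caller:

```python
# Minimal sketch: code execution triggered purely by deserializing untrusted bytes.
import pickle


class Gadget:
    # __reduce__ tells pickle how to reconstruct the object; an attacker can
    # abuse it so that an arbitrary callable is invoked during unpickling.
    def __reduce__(self):
        import os
        return (os.system, ("echo command runs during unpickling",))


payload = pickle.dumps(Gadget())   # bytes an attacker could hand to a victim
pickle.loads(payload)              # the embedded command executes here
```

Safer patterns include exchanging data-only formats such as JSON, or strictly allow-listing the classes permitted during deserialization.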