## Usage
Configuring the OpenLineage Spark integration is straightforward: it uses Spark's built-in configuration mechanisms. Databricks users, however, need to take special care to ensure compatibility and to avoid breaking the Spark UI after a cluster shutdown.
Your options are:

- Setting the properties directly in your application (see the sketch after this list).
- Using `--conf` options with the CLI.
- Adding properties to the `spark-defaults.conf` file in the `${SPARK_HOME}/conf` directory.
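As a sketch of the first option, the PySpark snippet below sets the properties on the `SparkSession` builder. The listener class and the `spark.openlineage.*` keys follow the OpenLineage Spark documentation, but the transport URL and namespace are placeholders, and you should confirm the exact property names against the configuration reference for your OpenLineage version. The same key/value pairs can equally be passed as `--conf` flags to `spark-submit` or written into `spark-defaults.conf`.

```python
# Minimal sketch: configuring OpenLineage properties directly in the application.
# Assumes the OpenLineage Spark agent jar is already on the classpath
# (for example via spark.jars.packages or a cluster-level library).
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("openlineage-example")
    # Register the OpenLineage listener so Spark emits lineage events.
    .config("spark.extraListeners", "io.openlineage.spark.agent.OpenLineageSparkListener")
    # Send events over HTTP to an OpenLineage-compatible backend (placeholder URL).
    .config("spark.openlineage.transport.type", "http")
    .config("spark.openlineage.transport.url", "http://localhost:5000")
    # Logical namespace used to group this job's lineage (placeholder value).
    .config("spark.openlineage.namespace", "my_namespace")
    .getOrCreate()
)
```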