Is there a way to execute Spark code locally with databricks-connect?
The reason is that I would like to execute some tests as part of my CI/CD pipeline without the need to have a cluster up and running.
No, databricks-connect requires a running cluster.
If you do not use any Databricks-specific code (like dbutils), you can run Spark locally and execute your tests against that, assuming you can still access the data sources you need.
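For example, here is a minimal sketch of how that could look with pytest and a plain local PySpark session. This assumes `pyspark` is installed in your CI environment and that your code accepts a SparkSession as a parameter rather than creating its own; the fixture and test names are illustrative:

```python
import pytest
from pyspark.sql import SparkSession


@pytest.fixture(scope="session")
def spark():
    # Local in-process Spark session; no Databricks cluster required.
    session = (
        SparkSession.builder
        .master("local[2]")
        .appName("ci-tests")
        .getOrCreate()
    )
    yield session
    session.stop()


def test_filter_transformation(spark):
    # Hypothetical test: verify a simple DataFrame transformation locally.
    df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "label"])
    result = df.filter(df.id > 1).collect()
    assert len(result) == 1
    assert result[0].label == "b"
```

In a CI/CD pipeline this runs entirely in-process, so no cluster or databricks-connect configuration is needed; only tests that touch dbutils or other Databricks-only APIs would still need a real cluster.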