Is there a way to execute Spark code locally with databricks-connect?

The reason is that I would like to run some tests as part of my CI/CD pipeline without needing a cluster up and running.

  • I'm having a similar issue with this for unit testing, mocking spark_session. How did you solve it? Commented Feb 24, 2021 at 17:58

1 Answer

No, databricks-connect requires a running cluster. If you do not use any Databricks-specific code (like dbutils), you can run Spark locally and execute against that, assuming you can still access the data sources you need.
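For example, one common pattern is to spin up a local SparkSession in a pytest fixture so CI tests run entirely in-process, with no cluster. A minimal sketch, assuming pyspark and pytest are installed (the fixture and test names here are illustrative, not from the question):

    import pytest
    from pyspark.sql import SparkSession, functions as F

    @pytest.fixture(scope="session")
    def spark():
        # Local SparkSession: runs in this process, no Databricks cluster needed.
        session = (
            SparkSession.builder
            .master("local[2]")
            .appName("ci-tests")
            .getOrCreate()
        )
        yield session
        session.stop()

    def test_upper(spark):
        # Any Spark logic that avoids dbutils behaves the same against a local session.
        df = spark.createDataFrame([("a",), ("b",)], ["letter"])
        result = df.select(F.upper("letter").alias("letter")).collect()
        assert [r.letter for r in result] == ["A", "B"]

In CI this only requires a JVM and the pyspark package on the build agent; the same test code can later run against a real cluster via databricks-connect if needed.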

1 Comment

Sad news, as I will need to wait for the DDL/DML available in Databricks to be supported in open-source Delta Lake. Thanks.
