
I want to create a table in Snowflake and load data into it from Databricks using Python or Scala.

Below is my code snippet, which gives me an error. How can I first create the table (if it doesn't exist) from a Databricks notebook using Python or Scala, and then load the data?

What functions do I need to use? The following gives me an error:

df1.write.format("snowflake").options(sfOptions).option("dbtable","TEST_TABLE")
         .mode(SaveMode.Append)
  • Can you provide the error message? Commented Oct 12, 2021 at 11:37
  • Without an error message, this question is missing a minimal reproducible example. I will vote to close for now. Commented May 25, 2023 at 11:31

1 Answer


If you use Scala, your DataFrame write should look like this:

df.write
    .format(SNOWFLAKE_SOURCE_NAME)
    .options(sfOptions)
    .option("dbtable", "t2")
    .mode(SaveMode.Append)
    .save()

If you use Python, your DataFrame write should look like this:

df.write
    .format(SNOWFLAKE_SOURCE_NAME)
    .options(**sfOptions)
    .option("dbtable", "t2")
    .mode(SaveMode.Append)
    .save()

where:

SNOWFLAKE_SOURCE_NAME = "net.snowflake.spark.snowflake"

Note the difference in how the options are passed: Scala takes the options map directly via `.options(sfOptions)`, while Python unpacks the dictionary with `.options(**sfOptions)`.
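For completeness, here is a minimal Python sketch showing what `sfOptions` might look like and wrapping the write above in a helper. The option keys (`sfUrl`, `sfUser`, `sfPassword`, `sfDatabase`, `sfSchema`, `sfWarehouse`) are standard Snowflake Spark connector options, but every value below is a placeholder you must replace with your own account details. Regarding the original question about creating the table first: in my experience the connector creates the target table automatically on write if it does not already exist, so an explicit CREATE TABLE is usually unnecessary.

```python
# Connection options for the Snowflake Spark connector.
# All values are placeholders -- substitute your own account details.
sfOptions = {
    "sfUrl": "myaccount.snowflakecomputing.com",  # hypothetical account URL
    "sfUser": "my_user",
    "sfPassword": "my_password",
    "sfDatabase": "MY_DB",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "MY_WH",
}

SNOWFLAKE_SOURCE_NAME = "net.snowflake.spark.snowflake"

def write_to_snowflake(df, table_name):
    """Append a Spark DataFrame to a Snowflake table.

    The connector should create the table automatically if it
    does not already exist, so no separate CREATE TABLE is needed.
    """
    (df.write
        .format(SNOWFLAKE_SOURCE_NAME)
        .options(**sfOptions)           # note the ** unpacking in Python
        .option("dbtable", table_name)
        .mode("append")
        .save())
```

Call it as `write_to_snowflake(df1, "TEST_TABLE")` from a Databricks notebook where `df1` is your DataFrame.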

