
I am messing around with some Python and Snowflake. I am using a JSON file to store my connection info, like this:

{
  "snowflake_config": {
    "user":      "automation",
    "password":  "************",
    "account":   "someaccounthere",
    "warehouse": "somewarehousehere",
    "database":  "somedatabasehere",
    "schema":    "someschemahere"
  }
}

But I also have a few hundred SQL queries that I would like to run, and they need to accept arguments. Can I store them in a JSON file as well? Or does Snowflake's execute_stream really require the file to be static? I was hoping I could do something like this (pseudo code):

{
 "snowflake_queries": {
  "sql1": "GRANT SELECT ON ALL TABLES IN SCHEMA %s<this is argparse1> to role <argparse2>",
  "sql2": "GRANT SELECT ON ALL VIEWS IN SCHEMA %s<this is argparse1> to role <argparse2>"
 }
}

1 Answer


I don't think execute_stream() can handle parameters, but why do you want to use it? execute_stream() is for verbatim execution of a script, like Run All Queries in a Snowflake web GUI worksheet. That means you can use e.g. SQL variables as parameters within the script, but you can't use parameters from somewhere else unless you prepend something like SET myvar = value; to the script.
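
To sketch that workaround (just an illustration; the variable names, schema, and role are made up, and my_connection is assumed to be an already-open connector connection), you could prepend the SET statements to the script text and feed the whole thing to execute_stream():

from io import StringIO

# SQL variables are set first, then referenced via IDENTIFIER($...) later in the script
script = (
    "SET target_schema = 'MY_DB.MY_SCHEMA';\n"
    "SET target_role = 'SOME_ROLE';\n"
    "GRANT SELECT ON ALL TABLES IN SCHEMA IDENTIFIER($target_schema) TO ROLE IDENTIFIER($target_role);\n"
    "GRANT SELECT ON ALL VIEWS IN SCHEMA IDENTIFIER($target_schema) TO ROLE IDENTIFIER($target_role);\n"
)

# execute_stream() runs the statements in order and yields one cursor per statement
for cur in my_connection.execute_stream(StringIO(script)):
    for row in cur:
        print(row)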

I suggest you use just execute(), or perhaps executemany(); both take parameters.
In general, they work like this with a tuple of parameters:

snowflake_queries = {
    "q1": "GRANT SELECT ON ALL TABLES IN SCHEMA IDENTIFIER(:1) TO ROLE IDENTIFIER(:2)",
    "q2": "GRANT SELECT ON ALL VIEWS IN SCHEMA IDENTIFIER(:1) TO ROLE IDENTIFIER(:2)"
}

# SomeSchema and SomeRole hold the values to bind; pass them as a single tuple
for key, sqlText in snowflake_queries.items():
    my_cursor.execute(sqlText, (SomeSchema, SomeRole))
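
The :1/:2 style binds are numeric (server-side) binds, so the connector has to be switched away from its default pyformat paramstyle before connecting. For completeness, roughly like this (a sketch only; the config.json file name is assumed, reusing the "snowflake_config" layout from your question):

import json
import snowflake.connector

# Numeric binding (:1, :2) needs the 'numeric' paramstyle; the default is 'pyformat' (%s)
snowflake.connector.paramstyle = 'numeric'

# Assumed file name, holding the "snowflake_config" block from the question
with open("config.json") as f:
    cfg = json.load(f)["snowflake_config"]

my_connection = snowflake.connector.connect(**cfg)
my_cursor = my_connection.cursor()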

There are some commands and functions with parts that cannot yet be substituted by bind parameters. As of Dec 2019 those are import/export command parameters and, notably, time travel, e.g. AT(OFFSET => n), and the GENERATOR(ROWCOUNT => n) function.
For those you have to do the string formatting yourself in the sqlText.
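
A plain Python substitution is enough for that (a sketch only; the query and row count are made-up examples, my_cursor is the cursor from above, and the int() cast is one way to keep anything but a number out of the statement):

# ROWCOUNT cannot be bound, so format the value into the SQL text yourself
row_count = int("10")
sqlText = "SELECT SEQ4() FROM TABLE(GENERATOR(ROWCOUNT => {}))".format(row_count)
my_cursor.execute(sqlText)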

In other commands it is possible to use non-literal parameters with the IDENTIFIER() wrapper, e.g. for schema/table and user/role names. I've used this in the code above.

A final word:
The case above (an administrative helper script) is a perfect candidate for a Stored Procedure.


2 Comments

I was looking at args because I would like arguments passed on the command line, like python some_python.py --firstarg SomeSchema --secondarg SomeRole, to then be passed into the queries in the JSON, like what I mentioned above. I guess I can use execute() to just iterate over the queries in the JSON; the parameters to the queries should come from the args passed in.
Sure. Like the following: for key, sqlText in snowflake_queries.items(): my_cursor.execute(sqlText, (SomeSchema, SomeRole))
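
Spelled out end to end (a sketch only; the --firstarg/--secondarg flags come from your example, config.json and queries.json are assumed file names, and queries.json holds the "snowflake_queries" layout from the question):

import argparse
import json
import snowflake.connector

snowflake.connector.paramstyle = 'numeric'   # needed for the :1/:2 binds

parser = argparse.ArgumentParser()
parser.add_argument("--firstarg")    # schema name
parser.add_argument("--secondarg")   # role name
args = parser.parse_args()

# Assumed file names: connection info and the query dictionary, both in JSON
with open("config.json") as f:
    cfg = json.load(f)["snowflake_config"]
with open("queries.json") as f:
    queries = json.load(f)["snowflake_queries"]

con = snowflake.connector.connect(**cfg)
try:
    cur = con.cursor()
    for key, sqlText in queries.items():
        cur.execute(sqlText, (args.firstarg, args.secondarg))
finally:
    con.close()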
