from pyspark import SparkConf, SparkContext
from pyspark.sql import SQLContext

conf = SparkConf().setAppName("Test").set("spark.driver.memory", "1g")
sc = SparkContext(conf = conf)

sqlContext = SQLContext(sc)

results = sqlContext.sql("/home/ubuntu/workload/queryXX.sql")

When I execute this script with python test.py, it gives me an error:

py4j.protocol.Py4JJavaError: An error occurred while calling o20.sql.
: java.lang.RuntimeException: [1.1] failure: ``with'' expected but `/' found

/home/ubuntu/workload/queryXX.sql

    at scala.sys.package$.error(package.scala:27)

I am very new to Spark and need help to move forward.

3 Answers


sqlContext.sql expects a valid SQL query, not a path to a file. Try this:

with open("/home/ubuntu/workload/queryXX.sql") as fr:
    query = fr.read()

results = sqlContext.sql(query)
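If the file happens to contain several statements separated by semicolons, keep in mind that sqlContext.sql runs one statement at a time, so you would need to split the text first. A minimal sketch, assuming queryXX.sql holds plain ;-separated statements with no semicolons inside string literals:

with open("/home/ubuntu/workload/queryXX.sql") as fr:
    # naive split on ';' -- assumes no semicolons occur inside string literals
    statements = [s.strip() for s in fr.read().split(";") if s.strip()]

for stmt in statements:
    results = sqlContext.sql(stmt)  # keeps the result of the last statement

results.show()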

Running spark-sql --help will give you:

CLI options:
 -d,--define <key=value>          Variable subsitution to apply to hive
                                  commands. e.g. -d A=B or --define A=B
    --database <databasename>     Specify the database to use
 -e <quoted-query-string>         SQL from command line
 -f <filename>                    SQL from files
 -H,--help                        Print help information
    --hiveconf <property=value>   Use value for given property
    --hivevar <key=value>         Variable subsitution to apply to hive
                                  commands. e.g. --hivevar A=B
 -i <filename>                    Initialization SQL file
 -S,--silent                      Silent mode in interactive shell
 -v,--verbose                     Verbose mode (echo executed SQL to the
                                  console)

So you can execute your SQL script like this:

spark-sql -f <your-script>.sql
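For example, with the file from the question (this assumes the spark-sql binary is on your PATH, e.g. from $SPARK_HOME/bin):

spark-sql -f /home/ubuntu/workload/queryXX.sql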


I'm not sure whether this answers your question, but if you intend to run a query on an existing table, you can use:

spark-sql -i <Filename_with abs path/.sql>

One more thing: if you have a PySpark script, you can use spark-submit; the details are described here.
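To tie the answers together, here is a minimal sketch of what a corrected test.py could look like when run with spark-submit; the path and app name are the ones from the question, and results.show() is added only so the script prints something:

from pyspark import SparkConf, SparkContext
from pyspark.sql import SQLContext

conf = SparkConf().setAppName("Test")
sc = SparkContext(conf=conf)
sqlContext = SQLContext(sc)

# Read the SQL text from the file and hand the query string to Spark
with open("/home/ubuntu/workload/queryXX.sql") as fr:
    query = fr.read()

results = sqlContext.sql(query)
results.show()

Submit it with something like spark-submit --driver-memory 1g test.py; in client mode the driver JVM is already running by the time SparkConf is read, so spark.driver.memory is usually passed on the command line rather than set inside the script.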
