
While trying to connect to a MySQL database in RDS from an EMR Jupyter Notebook, I get the following error:

Code Used:

from pyspark.sql import SparkSession

# The EMR notebook kernel already provides a `spark` session;
# getOrCreate() returns it (or builds one when run outside the notebook).
spark = SparkSession.builder.getOrCreate()

hostname = "hostname"
dbname = "mysql"
jdbcPort = 3306
username = "user"
password = "password"

jdbc_url = "jdbc:mysql://{0}:{1}/{2}?user={3}&password={4}".format(hostname, jdbcPort, dbname, username, password)
query = "(select * from framework.File_Columns) as table1"

df1 = spark.read.format('jdbc').options(driver='com.mysql.jdbc.Driver', url=jdbc_url, dbtable=query).load()
df1.show()

Error message:

An error occurred while calling o89.showString. : org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, ip-172-31-37-50.us-west-2.compute.internal, executor 1): java.lang.ClassNotFoundException: com.mysql.jdbc.Driver

I have downloaded the required mysql-connector-java-5.1.47.jar to /home/hadoop/mysql-connector-java-5.1.47.jar and have updated the Spark configuration file as follows:

spark.master                     yarn

spark.driver.extraClassPath      :/usr/lib/hadoop-lzo/lib/*:/usr/lib/hadoop/hadoop-aws.jar:/usr/share/aws/aws-java-sdk/*:/usr/share/aws/emr/emrfs/conf:/usr/share/aws/emr/emrfs/lib/*:/usr/share/aws/emr/emrfs/auxlib/*:/usr/share/aws/emr/goodies/lib/emr-spark-goodies.jar:/usr/share/aws/emr/security/conf:/usr/share/aws/emr/security/lib/*:/usr/share/aws/hmclient/lib/aws-glue-datacatalog-spark-client.jar:/usr/share/java/Hive-JSON-Serde/hive-openx-serde.jar:/usr/share/aws/sagemaker-spark-sdk/lib/sagemaker-spark-sdk.jar:/home/hadoop/extrajars/*:/home/hadoop/extrajars/mysql-connector-java-5.1.47.jar

spark.driver.extraLibraryPath    /usr/lib/hadoop/lib/native:/usr/lib/hadoop-lzo/lib/native:/home/hadoop/extrajars/*:/home/hadoop/extrajars/mysql-connector-java-5.1.47.jar

spark.executor.extraClassPath    :/usr/lib/hadoop-lzo/lib/*:/usr/lib/hadoop/hadoop-aws.jar:/usr/share/aws/aws-java-sdk/*:/usr/share/aws/emr/emrfs/conf:/usr/share/aws/emr/emrfs/lib/*:/usr/share/aws/emr/emrfs/auxlib/*:/usr/share/aws/emr/goodies/lib/emr-spark-goodies.jar:/usr/share/aws/emr/security/conf:/usr/share/aws/emr/security/lib/*:/usr/share/aws/hmclient/lib/aws-glue-datacatalog-spark-client.jar:/usr/share/java/Hive-JSON-Serde/hive-openx-serde.jar:/usr/share/aws/sagemaker-spark-sdk/lib/sagemaker-spark-sdk.jar:/home/hadoop/extrajars/*:/home/hadoop/extrajars/mysql-connector-java-5.1.47.jar

spark.executor.extraLibraryPath  /usr/lib/hadoop/lib/native:/usr/lib/hadoop-lzo/lib/native:/home/hadoop/extrajars/*:/home/hadoop/extrajars/mysql-connector-java-5.1.47.jar

Is there anything more I need to do to connect to the MySQL database from the Jupyter Notebook?

2 Answers


Since Spark cannot find the driver class when you run this from the Jupyter Notebook, try copying mysql-connector-java-5.1.47.jar into the $SPARK_HOME/jars folder. In my experience this resolves the driver issue.
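For example, on EMR the Spark installation is typically under /usr/lib/spark, so the copy could look like the minimal sketch below (the paths are assumptions about your cluster, and writing into the jars folder may require root permissions):

# A minimal sketch, assuming Spark is installed under /usr/lib/spark on the
# EMR node and the connector jar was downloaded to /home/hadoop.
import shutil

connector_jar = "/home/hadoop/mysql-connector-java-5.1.47.jar"
spark_jars_dir = "/usr/lib/spark/jars/"   # $SPARK_HOME/jars on a typical EMR install

shutil.copy(connector_jar, spark_jars_dir)  # puts the driver on Spark's classpath

After copying the jar, you will likely need to restart the notebook kernel (and hence the Spark application) so the new classpath is picked up.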


You can also do this:

spark.conf.set("spark.jars", "s3://bucket-name/folder-name/mysql-connector-java-5.1.38-bin.jar")
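If the session has not been created yet, the same idea can be expressed on the builder instead; this is a minimal sketch using the spark.jars property (the S3 path is the placeholder from above, and the application name is made up):

from pyspark.sql import SparkSession

# A minimal sketch: supply the connector jar via spark.jars before the session
# is created. On EMR, Spark can fetch jars from S3 through EMRFS.
spark = (
    SparkSession.builder
    .appName("mysql-jdbc")   # hypothetical application name
    .config("spark.jars", "s3://bucket-name/folder-name/mysql-connector-java-5.1.38-bin.jar")
    .getOrCreate()
)

Setting jar locations after the session already exists generally has no effect, so the builder approach (or the classpath changes from the question) is usually the safer option.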
