
I have a MySQL table which includes a column that is AUTO_INCREMENT:

CREATE TABLE features (
  id INT NOT NULL AUTO_INCREMENT,
  name CHAR(30),
  value DOUBLE PRECISION,
  PRIMARY KEY (id)
);

I created a DataFrame and want to insert it into this table:

import java.util.Properties
import org.apache.spark.sql.SaveMode
import spark.implicits._  // enables rdd.toDF()

case class Feature(name: String, value: Double)
val df = rdd.toDF()  // rdd: RDD[Feature], built elsewhere
df.write.mode(SaveMode.Append).jdbc("jdbc:mysql://...", "features", new Properties)

I get the error Column count doesn't match value count at row 1. If I delete the id column from the table, it works. How can I insert this data into the table without changing the schema?

1 Answer


You have to include an id field in the DataFrame so that the column counts match, but its value is just a placeholder that MySQL replaces with the auto-incremented ID. That is:

case class Feature(id: Int, name: String, value: Double)

Then set id to 0 when you create a Feature: with MySQL's default SQL mode, inserting 0 into an AUTO_INCREMENT column makes the database generate the next value.
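For example, here is a minimal end-to-end sketch. The SparkSession name spark, the sample rows, and the connection details are placeholders, not from the original question:

import java.util.Properties
import org.apache.spark.sql.SaveMode

case class Feature(id: Int, name: String, value: Double)

// id = 0 is a dummy value; MySQL generates the real id on insert
val df = spark.createDataFrame(Seq(
  Feature(0, "height", 1.83),
  Feature(0, "weight", 72.5)
))

val props = new Properties()
props.setProperty("user", "root")        // placeholder credentials
props.setProperty("password", "secret")

df.write.mode(SaveMode.Append)
  .jdbc("jdbc:mysql://localhost:3306/test", "features", props)

One caveat: this relies on MySQL's default behavior of generating a new value when 0 is inserted into an AUTO_INCREMENT column. If the session has the NO_AUTO_VALUE_ON_ZERO SQL mode enabled, the literal 0 is stored instead, and you would need a nullable id (e.g. Option[Int]) so that a NULL triggers the auto-increment.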


Comments

Sorry it's not a straight answer, but I barely understand the question. Hope it helps.
I now understand why the error happens, but I still don't know how to fix it. Let me try to explain more clearly: my table has 5 columns, including the id column, while my DataFrame has only 4, so the DataFrame maps onto just the first 4 columns of the MySQL table and leaves the fifth unmapped. With plain SQL I know how to do the insert, but with Spark I don't, even after searching the APIs. I hope that makes sense; I can also e-mail you if you understand Chinese.
Okay. So the question is basically how to use insertIntoJDBC with an auto_increment field. Try inserting 0 for the id. Maybe the database will sort it out.
I know it's bad to ask another unrelated question under this question, but could you tell me how to follow you? I don't know how to follow people on Stack Overflow.
I don't think you can follow people on Stack Overflow. You can subscribe to a tag though! I recommend subscribing to apache-spark!
