
I have a large (4 columns by about 900,000 lines) CSV file that I need to convert to SQL, and then obviously split into more manageable sizes so that I can import it.

Does anyone know of a program to do this? Obviously I could do it manually, but with 900,000 lines it'll take a while and I'd much rather automate.

Edit for clarity: I have all the data in a .csv format on my Windows PC. I want to split then convert locally, then upload via phpmyadmin to an online server hosted elsewhere. Thanks

Cheers

  • Which OS? Which database? What does your data look like? Commented Aug 31, 2014 at 7:19
  • Do you want to fill one table with the four columns, or does the data need to go into multiple tables? Commented Aug 31, 2014 at 7:20
  • I have all the data in a .csv format on my Windows PC. I want to split, then convert locally, then upload via phpMyAdmin to an online server hosted elsewhere. Thanks Commented Aug 31, 2014 at 7:27
  • Which DBMS are you using? Some DBMSs can directly access CSV files as if they were tables ("external tables"). Commented Aug 31, 2014 at 7:37
  • It will be imported into MySQL, but currently I just have the .csv locally on my PC. I could import directly as .csv, but I prefer to use .sql files... I find I have less trouble that way. Thanks Commented Aug 31, 2014 at 7:40

2 Answers


I think you're going about this backwards: since CSV is such a trivial format, you should definitely split it in that form first and worry about SQL afterwards, not the other way around.

Splitting it into multiple CSV files is easy in pretty much any language, as long as you use a streaming API (to avoid reading all 900,000 rows at once). Are you proficient in any language (C#, C++, PowerShell, something else)?
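For example, here is a minimal sketch in Python of a streaming split. It reads one row at a time, so memory stays flat regardless of file size; the file names, chunk size, and `part` prefix are just illustrative assumptions, not anything from the question.

```python
import csv

def split_csv(source, chunk_size, prefix="part"):
    """Split a CSV file into chunks of chunk_size rows, repeating the
    header row in every chunk. Returns the list of files written."""
    outputs = []
    with open(source, newline="", encoding="utf-8") as src:
        reader = csv.reader(src)
        header = next(reader)  # keep the header for every chunk
        out = writer = None
        for i, row in enumerate(reader):
            if i % chunk_size == 0:  # start a new output file
                if out:
                    out.close()
                name = f"{prefix}{len(outputs) + 1}.csv"
                outputs.append(name)
                out = open(name, "w", newline="", encoding="utf-8")
                writer = csv.writer(out)
                writer.writerow(header)
            writer.writerow(row)
        if out:
            out.close()
    return outputs

# e.g. split_csv("data.csv", 100_000) for ~9 files of 100,000 rows each
```

Because it streams row by row, a 900,000-line file is no harder than a 9-line one; only `chunk_size` rows ever need to fit in the output buffer.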

What operating system are you running, and which SQL DBMS are you targeting? The solutions vary a lot between, say, Windows with SQL Server and Ubuntu with MySQL; depending on your DBMS, there may be built-in tools that do this for you.


3 Comments

Thanks. I will be importing it into MySQL. I'm not overly proficient in any programming language; most of my very limited skill is in web dev.
As another comment suggested, you should import the CSV using phpMyAdmin's built-in import functionality, or let us know why that isn't possible.
Sorry, but this is not an answer.

This is what I ended up doing:

I downloaded a .csv splitter and split my file into 9 files of 100,000 lines each. Those were probably still slightly large, but it worked.

After splitting, I converted them all to .sql and imported them into the database.
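The convert-to-.sql step can be sketched in a few lines of Python. This is an assumed approach, not what the answerer actually used: it writes batched `INSERT` statements, quoting every value as a string (MySQL casts to the column types on import) and escaping quotes and backslashes. The table name and batch size here are placeholders.

```python
import csv

def csv_to_sql(csv_path, sql_path, table, batch=1000):
    """Turn a CSV file (with a header row) into a .sql file of
    batched INSERT statements for MySQL."""
    def quote(v):
        # Escape backslashes first, then single quotes
        return "'" + v.replace("\\", "\\\\").replace("'", "\\'") + "'"

    with open(csv_path, newline="", encoding="utf-8") as src, \
         open(sql_path, "w", encoding="utf-8") as dst:
        reader = csv.reader(src)
        header = next(reader)
        cols = ", ".join(f"`{c}`" for c in header)
        values = []
        for row in reader:
            values.append("(" + ", ".join(quote(v) for v in row) + ")")
            if len(values) == batch:  # flush a full batch
                dst.write(f"INSERT INTO `{table}` ({cols}) VALUES\n"
                          + ",\n".join(values) + ";\n")
                values = []
        if values:  # flush the final partial batch
            dst.write(f"INSERT INTO `{table}` ({cols}) VALUES\n"
                      + ",\n".join(values) + ";\n")
```

Batching many rows per `INSERT` keeps the .sql files far smaller and faster to import than one statement per row.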

Obviously with the files being so large it took a long time to convert and import, but it worked.

I tried importing straight from the .csv, but the server wouldn't accept it. I'm not entirely sure why, but hey, the data is in my database now, and that's all I really wanted.

1 Comment

Usually you would create a staging table, with all columns as varchar(max).
