
I have a CSV file with 7 columns that a user has to upload so its contents can be added to the database. I found some help on reading the CSV and putting all the info into a single table; however, the data has to be spread over three tables.

My code for inserting all the data into one table:

    protected void Upload(object sender, EventArgs e)
    {
        //Upload and save the file
        string csvPath = Server.MapPath("~/Temp/") + Path.GetFileName(FileUpload1.PostedFile.FileName);
        FileUpload1.SaveAs(csvPath);

        DataTable dt = new DataTable();
        dt.Columns.AddRange(new DataColumn[7] { 
        new DataColumn("Title", typeof(string)),
        new DataColumn("Artist", typeof(string)),
        new DataColumn("Years", typeof(string)),
        new DataColumn("Position", typeof(string)),
        new DataColumn("Senddate", typeof(string)),
        new DataColumn("Sendfrom", typeof(string)),
        new DataColumn("Sendtill", typeof(string))});


        string csvData = File.ReadAllText(csvPath);
        foreach (string row in csvData.Split('\n'))
        {
            if (!string.IsNullOrEmpty(row))
            {
                dt.Rows.Add();
                int i = 0;
                foreach (string cell in row.Split(';'))
                {
                    dt.Rows[dt.Rows.Count - 1][i] = cell;
                    i++;
                }
            }
        }

        string consString = ConfigurationManager.ConnectionStrings["connection"].ConnectionString;
        using (SqlConnection con = new SqlConnection(consString))
        {
            using (SqlBulkCopy sqlBulkCopy = new SqlBulkCopy(con))
            {
                //Set the database table name
                sqlBulkCopy.DestinationTableName = "dbo.ingevoerd";
                con.Open();
                sqlBulkCopy.WriteToServer(dt);
                con.Close();
            }
        }
    }

As you can see, it takes the 7 columns and puts them into the table [dbo].[ingevoerd].

How can I split the data to put the columns 'Title' and 'Years' in a table called Song, 'Artist' in a table called Artiest, and 'Position', 'Senddate', 'Sendfrom' and 'Sendtill' in a table called Lijst?

If you need more information, leave a comment.

  • IMHO you should use SSIS for this task instead. – Commented Jun 4, 2015 at 10:11

2 Answers


IMHO this is not the best way to handle this upload, because the content is not flat data you can bulk upload in a breeze; there are several entities (at least 3) that should be linked.

I would go with the 'old style' approach of calling an insert for each row with proper parameters.

You are already looping through the whole recordset when reading the CSV, so I would do something like:

    protected void Upload(object sender, EventArgs e)
    {
        //Upload and save the file
        string csvPath = Server.MapPath("~/Temp/") + Path.GetFileName(FileUpload1.PostedFile.FileName);
        FileUpload1.SaveAs(csvPath);

        string consString = ConfigurationManager.ConnectionStrings["connection"].ConnectionString;
        using (SqlConnection con = new SqlConnection(consString))
        {
            con.Open();
            using (SqlTransaction tran = con.BeginTransaction())
            using (SqlCommand cmd = new SqlCommand())
            {
                cmd.Connection = con;
                cmd.Transaction = tran;
                cmd.CommandType = System.Data.CommandType.StoredProcedure;
                cmd.CommandText = "your_sp_name_here";
                cmd.Parameters.Add(new SqlParameter("@title",System.Data.SqlDbType.NVarChar));
                cmd.Parameters.Add(new SqlParameter("@artist", System.Data.SqlDbType.NVarChar));
                // other parameters follow
                // ...

                string csvData = File.ReadAllText(csvPath);
                foreach (string row in csvData.Split('\n'))
                {
                    if (!string.IsNullOrEmpty(row))
                    {
                        // for every row, split it into cells and fill in the parameters with proper values
                        string[] cells = row.Split(';');
                        cmd.Parameters["@title"].Value = cells[0];
                        cmd.Parameters["@artist"].Value = cells[1];
                        // ...
                        cmd.ExecuteNonQuery();
                    }
                }

                // when done commit the transaction
                tran.Commit();
            }
        }
    }

Inside your stored procedure, handle the 'split' of the data into the relevant tables, taking all the steps required to avoid duplicates and possibly linking the data among the tables:

create procedure your_sp_name_here(@title nvarchar(50), @artist nvarchar(50), @year int)
as
begin
 -- add logic & checks here if needed
 -- ...
 -- ...

 -- if everything is ok insert the rows
 insert into songs (title, year) values (@title, @year)
 insert into Artiest (Artist) values (@artist)
end
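
A possible fuller shape for that procedure, with a duplicate check and linking through foreign keys, is sketched below. The Id, ArtiestId and SongId columns, the extra parameters and their types are assumptions for illustration, not part of the question's schema:

    create procedure your_sp_name_here
        (@title nvarchar(50), @artist nvarchar(50), @year int,
         @position nvarchar(50), @senddate nvarchar(50), @sendfrom nvarchar(50), @sendtill nvarchar(50))
    as
    begin
        -- insert the artist only if it does not exist yet, and remember its key
        declare @artistId int = (select Id from Artiest where Artist = @artist);
        if @artistId is null
        begin
            insert into Artiest (Artist) values (@artist);
            set @artistId = scope_identity();
        end

        -- same idea for the song, linked to the artist (ArtiestId is an assumed FK column)
        declare @songId int = (select Id from Song where Title = @title and Years = @year);
        if @songId is null
        begin
            insert into Song (Title, Years, ArtiestId) values (@title, @year, @artistId);
            set @songId = scope_identity();
        end

        -- the list entry references the song (SongId is an assumed FK column)
        insert into Lijst (SongId, Position, Senddate, Sendfrom, Sendtill)
        values (@songId, @position, @senddate, @sendfrom, @sendtill);
    end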

3 Comments

Thanks for your answer, I will take a look at this and try to implement it!
When you have a lot of data I would also recommend using a StreamReader instead of reading all the content at once; see the sketch below this comment list.
Another option is to use a dedicated library that greatly simplifies the interaction with CSV files.
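
A rough sketch of the StreamReader suggestion above: it would replace the File.ReadAllText/Split loop inside the answer's Upload method, reusing the csvPath and cmd variables from there (and it needs using System.IO):

    // Read the CSV line by line instead of loading the whole file into memory.
    // Assumes the same semicolon-separated, 7-column layout as in the question.
    using (StreamReader reader = new StreamReader(csvPath))
    {
        string row;
        while ((row = reader.ReadLine()) != null)
        {
            if (string.IsNullOrWhiteSpace(row))
                continue;

            string[] cells = row.Split(';');
            cmd.Parameters["@title"].Value = cells[0];
            cmd.Parameters["@artist"].Value = cells[1];
            // ... remaining parameters ...
            cmd.ExecuteNonQuery();
        }
    }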

Have you looked into column mappings?

Check out stackoverflow.com/questions/17469349/mapping-columns-in-a-datatable-to-a-sql-table-with-sqlbulkcopy
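
For example, staying with the SqlBulkCopy code from the question, a mapping for just the Song table might look roughly like the sketch below; the destination column names on the SQL side are assumptions based on the question, not a verified schema:

    // Bulk copy only the Title and Years columns of the DataTable into dbo.Song.
    // When explicit mappings are added, only the mapped columns are copied.
    using (SqlConnection con = new SqlConnection(consString))
    using (SqlBulkCopy sqlBulkCopy = new SqlBulkCopy(con))
    {
        sqlBulkCopy.DestinationTableName = "dbo.Song";
        sqlBulkCopy.ColumnMappings.Add("Title", "Title"); // source DataTable column -> assumed destination column
        sqlBulkCopy.ColumnMappings.Add("Years", "Years"); // assumed destination column
        con.Open();
        sqlBulkCopy.WriteToServer(dt);
    }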

1 Comment

Nope, I am not familiar with column mappings; I am going to look at that though.
